STCL Assignment


Name: Sahil Ratnaparkhi

Roll no: 431

What is machine learning?

Machine learning (ML) is a type of artificial intelligence (AI) that allows software
applications to become more accurate at predicting outcomes without being explicitly
programmed to do so. Machine learning algorithms use historical data as input to predict new
output values.

Recommendation engines are a common use case for machine learning. Other popular uses
include fraud detection, spam filtering, malware threat detection, business process
automation (BPA) and predictive maintenance.

Why is machine learning important?

Machine learning is important because it gives enterprises a view of trends in customer
behaviour and business operational patterns, as well as supports the development of new
products. Many of today's leading companies, such as Facebook, Google and Uber, make
machine learning a central part of their operations. Machine learning has become a significant
competitive differentiator for many companies.

What are the different types of machine learning?

Classical machine learning is often categorized by how an algorithm learns to become more
accurate in its predictions. There are four basic approaches: supervised learning,
unsupervised learning, semi-supervised learning and reinforcement learning. The type of
algorithm data scientists choose to use depends on what type of data they want to predict.

1. Supervised learning: In this type of machine learning, data scientists supply
algorithms with labelled training data and define the variables they want the algorithm
to assess for correlations. Both the input and the output of the algorithm are specified.

2. Unsupervised learning: This type of machine learning involves algorithms that train
on unlabelled data. The algorithm scans through data sets looking for any meaningful
connection. Neither the data that algorithms train on nor the predictions or
recommendations they output are predetermined.

3. Semi-supervised learning: This approach mixes the two above. Data scientists feed
an algorithm mostly unlabelled data together with a smaller amount of labelled
training data, which guides the learning.

4. Reinforcement learning: Data scientists typically use reinforcement learning to
teach a machine to complete a multi-step process for which there are clearly defined
rules. Data scientists program an algorithm to complete a task and give it positive or
negative cues as it works out how to do so. But for the most part, the
algorithm decides on its own what steps to take along the way.
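As a small sketch of the supervised case, the snippet below fits a straight line to labelled training pairs (inputs and outputs both specified) and then predicts a new output. The data values and function name are illustrative, not from any real dataset.

```python
# Minimal supervised-learning sketch: learn y = a*x + b from labelled
# (input, output) pairs, then predict an output for a new input.

def fit_line(xs, ys):
    """Least-squares fit of a straight line to (x, y) pairs."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope: covariance(x, y) divided by variance(x)
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - a * mean_x
    return a, b

# Labelled historical data (made-up): hours studied -> exam score
hours = [1, 2, 3, 4, 5]
scores = [52, 55, 61, 64, 70]

a, b = fit_line(hours, scores)
print(round(a * 6 + b, 1))  # predicted score for 6 hours of study
```

The "training" here is just solving for the line's parameters; real ML libraries generalize the same idea to far more flexible models.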
Artificial Intelligence

Artificial intelligence (AI) refers to the simulation of human intelligence in machines that are
programmed to think like humans and mimic their actions. The term may also be applied to
any machine that exhibits traits associated with a human mind such as learning and problem-
solving.

The ideal characteristic of artificial intelligence is its ability to rationalize and take actions
that have the best chance of achieving a specific goal. A subset of artificial intelligence is
machine learning, which refers to the concept that computer programs can automatically
learn from and adapt to new data without being assisted by humans. Deep learning techniques
enable this automatic learning through the absorption of huge amounts of unstructured data
such as text, images, or video.

Categorization of Artificial Intelligence

Artificial intelligence can be divided into two different categories: weak and strong. Weak
artificial intelligence embodies a system designed to carry out one particular job. Weak AI
systems include video games such as computer chess and personal assistants
such as Amazon's Alexa and Apple's Siri. You ask the assistant a question, and it answers it
for you.

Strong artificial intelligence systems are systems that carry on the tasks considered to be
human-like. These tend to be more complex and complicated systems. They are programmed
to handle situations in which they may be required to problem solve without having a person
intervene. These kinds of systems can be found in applications like self-driving cars or in
hospital operating rooms.
Internet Of Things

What is the Internet of Things?

The Internet of Things, or IoT, refers to the billions of physical devices around the world that
are now connected to the internet, all collecting and sharing data. Thanks to the arrival of
super-cheap computer chips and the ubiquity of wireless networks, it's possible to turn
anything, from something as small as a pill to something as big as an aeroplane, into a part of
the IoT. Connecting up all these different objects and adding sensors to them adds a level of
digital intelligence to devices that would be otherwise dumb, enabling them to communicate
real-time data without involving a human being. The Internet of Things is making the fabric
of the world around us smarter and more responsive, merging the digital and physical
universes.

What is an example of an Internet of Things device?

Pretty much any physical object can be transformed into an IoT device if it can be connected
to the internet to be controlled or communicate information.

A lightbulb that can be switched on using a smartphone app is an IoT device, as is a motion
sensor or a smart thermostat in your office or a connected streetlight. An IoT device could be
as fluffy as a child's toy or as serious as a driverless truck. Some larger objects may
themselves be filled with many smaller IoT components, such as a jet engine that's now filled
with thousands of sensors collecting and transmitting data back to make sure it is operating
efficiently. At an even bigger scale, smart cities projects are filling entire regions with sensors
to help us understand and control the environment. 

Applications

IoT applications run on IoT devices and can be created to be specific to almost every
industry and vertical, including healthcare, industrial automation, smart homes and buildings,
automotive, and wearable technology. Increasingly, IoT applications are using AI and
machine learning to add intelligence to devices.
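To make the "collecting and sharing data" idea concrete, here is a toy sketch of an IoT device: a simulated thermostat that samples a sensor and emits a timestamped JSON reading. All names are illustrative, and a real device would publish over a network protocol such as MQTT rather than print.

```python
import json
import random
import time

def read_temperature():
    """Pretend sensor: returns a temperature in degrees Celsius."""
    return round(random.uniform(18.0, 24.0), 1)

def make_reading(device_id):
    """Package one sensor sample as the JSON message a device might send."""
    return json.dumps({
        "device": device_id,
        "timestamp": int(time.time()),
        "temperature_c": read_temperature(),
    })

# One reading from one device; a real deployment loops this forever.
print(make_reading("thermostat-01"))
```

An IoT platform would ingest thousands of such messages and apply the analytics or ML mentioned above.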
Cyber Security

Cyber security is the practice of defending computers, servers, mobile devices, electronic
systems, networks, and data from malicious attacks. It's also known as information
technology security or electronic information security. The term applies in a variety of
contexts, from business to mobile computing, and can be divided into a few common
categories.

·        Network security is the practice of securing a computer network from intruders, whether
targeted attackers or opportunistic malware.

·        Application security focuses on keeping software and devices free of threats. A
compromised application could provide access to the data it's designed to protect. Successful
security begins in the design stage, well before a program or device is deployed.

·        Information security protects the integrity and privacy of data, both in storage and in
transit.

·        Operational security includes the processes and decisions for handling and protecting data
assets. The permissions users have when accessing a network and the procedures that
determine how and where data may be stored or shared all fall under this umbrella.

·        Disaster recovery and business continuity define how an organization responds to a cyber-
security incident or any other event that causes the loss of operations or data. Disaster
recovery policies dictate how the organization restores its operations and information to
return to the same operating capacity as before the event. Business continuity is the plan the
organization falls back on while trying to operate without certain resources.

·        End-user education addresses the most unpredictable cyber-security factor: people. Anyone
can accidentally introduce a virus to an otherwise secure system by failing to follow good
security practices. Teaching users to delete suspicious email attachments, not plug in
unidentified USB drives, and various other important lessons is vital for the security of any
organization.
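The "integrity" half of information security above can be sketched with a cryptographic hash: a digest recorded when data is stored or sent lets the receiver detect any later tampering. This is a minimal illustration, not a complete scheme (real systems also sign the digest so an attacker cannot replace it).

```python
import hashlib

def digest(data: bytes) -> str:
    """SHA-256 fingerprint of the data; changes if even one byte changes."""
    return hashlib.sha256(data).hexdigest()

original = b"quarterly sales report"
stored_digest = digest(original)  # recorded at storage/send time

# Later, on retrieval: recompute and compare.
print(digest(original) == stored_digest)                     # True: unchanged
print(digest(b"quarterly sales reportX") == stored_digest)   # False: tampered
```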
Block Chain

A Block chain is a chain of blocks which contain information. The data which is stored inside
a block depends on the type of block chain.
Block chain technology is most simply defined as a decentralized, distributed ledger that
records the provenance of a digital asset. Block chains are typically managed by a
peer-to-peer network for use as a publicly distributed ledger, where nodes collectively
adhere to a protocol to communicate and validate new blocks.

Block chain increases trust, security, transparency, and the traceability of data shared across a
business network and delivers cost savings with new efficiencies.
Block chain technology solves key issues like trust in a network. With trust built into the
technology itself, any organization can focus on solving the problems at hand. Global
governments have also understood its importance and are keen on implementing block chain
technology. For example, Dubai Smart City 2020 is a project which aims to build a smart city
with new technologies, including block chain.

Advantages:
1. Better Transparency
2. Enhanced Security
3. Reduced Costs
4. True Traceability
5. Improved Speed and Highly Efficient

For example: a Bitcoin block contains information about the sender, the receiver, and the
number of bitcoins to be transferred. The first block in the chain is called the Genesis block.
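The chain-of-blocks idea can be sketched in a few lines: each block stores its data plus the hash of the previous block, so altering any earlier block breaks every link after it. This toy omits the timestamp, nonce, and Merkle root a real Bitcoin block carries; the field names are illustrative.

```python
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 fingerprint of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_block(data, prev_hash):
    return {"data": data, "prev_hash": prev_hash}

# The first block (the Genesis block) has no real predecessor.
genesis = make_block({"note": "genesis block"}, "0" * 64)
block1 = make_block({"sender": "Alice", "receiver": "Bob", "bitcoins": 2},
                    block_hash(genesis))

def chain_is_valid(chain):
    """Every block must point at the hash of the block before it."""
    return all(b["prev_hash"] == block_hash(p) for p, b in zip(chain, chain[1:]))

print(chain_is_valid([genesis, block1]))  # True
genesis["data"]["note"] = "tampered"      # rewrite history...
print(chain_is_valid([genesis, block1]))  # ...and validation fails: False
```

This is exactly why tampering is detectable: changing old data changes its hash, which no longer matches what the next block recorded.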
Cloud Computing

What Is Cloud Computing?

Cloud computing is the delivery of different services through the Internet. These resources
include tools and applications like data storage, servers, databases, networking, and software.

Rather than keeping files on a proprietary hard drive or local storage device, cloud-based
storage makes it possible to save them to a remote database. As long as an electronic device
has access to the web, it has access to the data and the software programs to run it.

Cloud computing is a popular option for people and businesses for a number of reasons
including cost savings, increased productivity, speed and efficiency, performance, and
security.

Types of Cloud Services

Regardless of the kind of service, cloud computing services provide users with a series of
functions including:

·        Email
·        Storage, backup, and data retrieval
·        Creating and testing apps
·        Analysing data
·        Audio and video streaming
·        Delivering software on demand
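The storage idea above can be sketched as a toy: files live in one shared "remote" store, and any client with access can save or fetch them by name. A real service would expose this over HTTPS with authentication; the class and method names here are purely illustrative.

```python
class CloudStore:
    """Toy stand-in for a cloud storage service's remote database."""

    def __init__(self):
        self._files = {}

    def upload(self, name, content):
        self._files[name] = content

    def download(self, name):
        return self._files[name]

store = CloudStore()  # one store "in the cloud"

# Two different "devices" share the same data through it: one uploads,
# another (holding the same store reference) downloads.
store.upload("notes.txt", "draft ideas")
print(store.download("notes.txt"))  # prints: draft ideas
```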
Green Computing

·        Green computing is the environmentally responsible and eco-friendly use of
computers and their resources. In broader terms, it is also defined as the study of
designing, engineering, manufacturing, using and disposing of computing devices
in a way that reduces their environmental impact.

·        Many IT manufacturers and vendors are continuously investing in designing
energy-efficient computing devices, reducing the use of dangerous materials and
encouraging the recyclability of digital devices. Green computing practices came
into prominence in 1992, when the Environmental Protection Agency (EPA)
launched the Energy Star program.
To promote green computing concepts at all possible levels, the following four approaches
are employed: 
·        Green use: Minimizing the electricity consumption of computers and their
peripheral devices and using them in an eco-friendly manner
·        Green disposal: Repurposing existing equipment or appropriately disposing of,
or recycling, unwanted electronic equipment
·        Green design: Designing energy-efficient computers, servers, printers,
projectors and other digital devices
·        Green manufacturing: Minimizing waste during the manufacturing of
computers and other subsystems to reduce the environmental impact of these
activities
Advantages:
·        Conservation of resources means less energy is required to produce, use and
dispose of products.
·        Saving energy and resources saves money.
