Overview of Emerging Technologies I, II, and III

I. Artificial Intelligence, Machine Learning, and Deep Learning

II. Database Management for Data Science and Big Data Analytics

III. Internet of Things (IoT) and Industrial Internet of Things (IIoT)

Define Digital

What exactly does digital mean?

Digital describes electronic technology that generates, stores, and processes data in terms of two states: positive and non-positive (on and off), represented by the digits 1 and 0.
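As a small illustration, here is a minimal Python sketch (the values are arbitrary) showing how familiar data reduces to those two states:

# Every digital value reduces to bits (1s and 0s).
number = 42
print(bin(number))                   # 0b101010 - the binary form of 42

letter = "A"
print(format(ord(letter), "08b"))    # 01000001 - the 8-bit code for 'A'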

What is Digital Fluency?

Digital fluency is the aptitude to effectively and ethically interpret information, discover meaning, design content, construct knowledge, and communicate ideas in a digitally connected world.

Overview of Emerging Technologies:

Emerging technologies are technical innovations that represent progressive developments within a field, pursued for competitive advantage.

Define Emerging Technologies

Emerging technologies are new computer science technologies such as artificial intelligence, data analytics, and machine learning.

Emerging technologies include a variety of technologies such as educational technology, information technology, nanotechnology, biotechnology, cognitive science, robotics, and artificial intelligence.

Emerging Technologies Examples

Artificial intelligence (AI) is the branch of computer science that develops machines and software with human-like intelligence. John McCarthy, who coined the term in 1956, defined it as "the study of making intelligent machines".

The central functions (or goals) of AI research include reasoning, knowledge, planning, learning, natural language processing (communication), perception, and the ability to move and manipulate objects.

Examples

Artificial Intelligence (AI) and Machine Learning

Robotic Process Automation (RPA)

Edge Computing

Quantum Computing

Virtual Reality and Augmented Reality

Blockchain

Internet of Things (IoT)

5G

I. Artificial Intelligence, Machine Learning, and Deep Learning

Introduction to AI

What is artificial intelligence?

Artificial intelligence (AI) is a wide-ranging branch of computer science concerned with building smart machines capable of performing tasks that typically require human intelligence.

According to the father of Artificial Intelligence, John McCarthy, it is “The science and engineering of making intelligent machines, especially intelligent computer programs”.

Artificial Intelligence is a way of making a computer, a computer-controlled robot, or software think intelligently, in a manner similar to how intelligent humans think.

AI is accomplished by studying how the human brain thinks and how humans learn, decide, and work while trying to solve a problem, and then using the outcomes of this study as a basis for developing intelligent software and systems.

Why do we need artificial intelligence?

Artificial Intelligence is the simulation (imitation of a situation or process) of human intelligence processes by machines (computer systems). These processes include learning, reasoning, and self-correction.

We need Artificial Intelligence (AI) because the work we need to do keeps increasing day by day, so it makes sense to automate routine work.

This saves the organization's manpower and also increases productivity.

Additionally, with Artificial Intelligence, a company can free its skilled people to work on the development of the company.

 

What are examples of artificial intelligence?

Siri, Alexa and other smart assistants

Self-driving cars

Robo-advisors

Conversational bots

Email spam filters

Netflix's recommendations

Since the invention of computers or machines, their capability to perform various tasks has grown exponentially. Humans have developed the power of computer systems in terms of their diverse working domains, their increasing speed, and their reducing size with respect to time.

A branch of Computer Science named Artificial Intelligence pursues creating computers or machines that are as intelligent as human beings.

What are the four types of artificial intelligence?

Reactive Machines

Reactive machines are artificial intelligence systems programmed to provide a predictable (expected) output based on the input they receive. Reactive machines always respond to identical situations in exactly the same way every time, and they are not able to learn actions or conceive of the past or future.

Example

Deep Blue, the chess-playing IBM supercomputer that bested world champion Garry Kasparov.
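To make the idea concrete, here is a toy Python sketch (not how Deep Blue actually works, just an illustration of the reactive idea): the mapping from situation to response is fixed, nothing is remembered between calls, and identical inputs always produce identical outputs.

# A toy reactive "machine": a fixed input-to-output mapping with no memory.
RESPONSES = {
    "opponent_attacks_queen": "defend_queen",
    "opponent_castles": "advance_pawn",
}

def reactive_agent(situation):
    # No state is stored between calls; the agent cannot learn from the past.
    return RESPONSES.get(situation, "wait")

print(reactive_agent("opponent_attacks_queen"))   # always 'defend_queen'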

Limited Memory

Limited memory AI learns from the past and builds experiential knowledge by observing actions or data. This type of AI uses historical, observational data in combination with pre-programmed information to make predictions and perform complex classification tasks. It is the most widely used kind of AI today.

Example

 Autonomous vehicles use limited memory AI to observe other cars’ speed and direction, helping them “read the road” and adjust as needed.

Theory of Mind

Want to hold a meaningful conversation with an emotionally intelligent robot that looks and sounds like a real human being? That’s on the horizon with theory of mind AI.

With this type of AI, machines will acquire true decision-making capabilities that are similar to humans. Machines with theory of mind AI will be able to understand and remember emotions, then adjust behavior based on those emotions as they interact with people.

Example

The Kismet robot head, developed by Professor Cynthia Breazeal, could recognize emotional signals on human faces and replicate those emotions on its own face. Humanoid robot Sophia, developed by Hanson Robotics in Hong Kong, can recognize faces and respond to interactions with her own facial expressions.

Self-Awareness

The most advanced type of artificial intelligence is self-aware AI. When machines can be aware of their own emotions, as well as the emotions of others around them, they will have a level of consciousness and intelligence similar to human beings. This type of AI will have desires, needs, and emotions as well.

Machine learning definition

Machine learning is an application of artificial intelligence (AI) that provides systems the ability to automatically learn and improve from experience without being explicitly programmed.

Machine learning focuses on the development of computer programs that can access data and use it to learn for themselves.

Machine learning (ML) is the study of computer algorithms that can improve automatically through experience and by the use of data. Machine learning algorithms build a model based on sample data, known as "training data", in order to make decisions without being explicitly programmed to do so. 

Machine learning algorithms are used in a wide variety of applications, such as in medicine, email filtering, speech recognition, and computer vision, where it is difficult or unfeasible to develop conventional algorithms to perform the needed tasks.

 

Overview of Machine learning

Machine learning involves computers learning from data provided so that they carry out certain tasks. For simple tasks assigned to computers, it is possible to program algorithms telling the machine how to execute all steps required to solve the problem at hand; on the computer's part, no learning is needed. For more advanced tasks, it can be challenging for a human to manually create the needed algorithms. In practice, it can turn out to be more effective to help the machine develop its own algorithm, rather than having human programmers specify every needed step.

Example inputs and their desired outputs can then be used as training data for the computer to improve the algorithm(s) it uses to determine correct answers.

For example, to train a system for the task of digital character recognition, the MNIST (Modified National Institute of Standards and Technology) dataset of handwritten digits has often been used.
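As an illustration, here is a minimal sketch of training a digit classifier, assuming scikit-learn is installed; its bundled load_digits() dataset (a small MNIST-like set of 8x8 digit images) stands in for the full MNIST data.

# Train a handwritten-digit classifier from labeled examples.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

model = LogisticRegression(max_iter=2000)   # the model is learned from data,
model.fit(X_train, y_train)                 # not explicitly programmed
print("accuracy:", model.score(X_test, y_test))

Note that no rule for recognizing a "3" or a "7" is ever written by hand; the algorithm infers it from the training data.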

·       Image recognition. Image recognition is a well-known and widespread example of machine learning in the real world.

·       Speech recognition. Machine learning can translate speech into text.

·       Medical diagnosis. Machine learning can help with the diagnosis of diseases.

Deep learning definition

Deep learning is a type of machine learning and artificial intelligence (AI) that imitates the way humans gain certain types of knowledge. It is extremely beneficial to data scientists who are tasked with collecting, analyzing, and interpreting large amounts of data; deep learning makes this process faster and easier.

Deep learning (also known as deep structured learning) is part of a broader family of machine learning methods based on artificial neural networks with representation learning. Learning can be supervised, semi-supervised or unsupervised.

Deep-learning architectures such as deep neural networks, deep belief networks, deep reinforcement learning, recurrent neural networks, and convolutional neural networks have been applied to fields including computer vision, speech recognition, natural language processing, machine translation, bioinformatics, drug design, medical image analysis, material inspection, and board game programs, where they have produced results comparable to, and in some cases surpassing, human expert performance.

Definition

(Figure: representing images on multiple layers of abstraction in deep learning.)

Deep learning is a class of machine learning algorithms that uses multiple layers to progressively extract higher-level features from the raw input. For example, in image processing, lower layers may identify edges, while higher layers may identify concepts relevant to a human, such as digits, letters, or faces.

Overview of Deep Learning

In deep learning, each level learns to transform its input data into a slightly more abstract and composite representation. In an image recognition application, the raw input may be a matrix of pixels; the first representational layer may abstract the pixels and encode edges; the second layer may compose and encode arrangements of edges; the third layer may encode a nose and eyes; and the fourth layer may recognize that the image contains a face.

Importantly, a deep learning process can learn which features to optimally place in which level on its own. This does not completely eliminate the need for hand-tuning; for example, varying numbers of layers and layer sizes can provide different degrees of abstraction.

The word "deep" in "deep learning" refers to the number of layers through which the data is transformed.
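As an illustration, here is a minimal Keras sketch, assuming TensorFlow is installed; the layer sizes are illustrative, not prescriptive. The "depth" is simply the stack of layers the input passes through.

# A small "deep" network: each layer transforms its input into a slightly
# more abstract representation.
from tensorflow import keras

model = keras.Sequential([
    keras.Input(shape=(784,)),                    # raw input: 28x28 pixels, flattened
    keras.layers.Dense(128, activation="relu"),   # lower-level features (e.g. edges)
    keras.layers.Dense(64, activation="relu"),    # higher-level combinations
    keras.layers.Dense(10, activation="softmax"), # output: one score per digit class
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()   # lists the layers the data is transformed through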

Examples

·       Virtual assistants (e.g., Amazon Alexa, Google Assistant)

·       Translations

·       Vision for driverless delivery trucks, drones, and autonomous cars

·       Facial recognition

·       Medicine and pharmaceuticals

II. Database Management for Data Science and Big Data Analytics

Database Management for Data Science

Data management is the practice of collecting, keeping, and using data securely, efficiently, and cost-effectively. It can include storing data across multiple clouds as well as on premises.

A database is defined as a structured set of data held in a computer’s memory or on the cloud that is accessible in various ways.

Databases make structured storage secure, efficient, and fast. They provide a framework for how the data should be stored, structured, and retrieved. Having databases saves you the hassle (problem, difficulty and struggle) of needing to figure out what to do with your data in every new project.
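As a small illustration, here is a sketch using sqlite3, the database module built into Python's standard library; the table and values are made up for the example. The database, not our own ad-hoc code, handles storage, structure, and retrieval.

# Store structured data and retrieve it with a query.
import sqlite3

conn = sqlite3.connect(":memory:")   # an in-memory database for illustration
conn.execute("CREATE TABLE measurements (sensor TEXT, value REAL)")
conn.execute("INSERT INTO measurements VALUES ('temp', 21.5)")
conn.execute("INSERT INTO measurements VALUES ('temp', 22.1)")

for row in conn.execute(
        "SELECT sensor, AVG(value) FROM measurements GROUP BY sensor"):
    print(row)   # ('temp', 21.8)
conn.close()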

Data science 

Data science is one of the fastest-growing fields. It combines the scientific method, math and statistics, specialized programming, and advanced analytics (analytics is the process of discovering, interpreting, and communicating significant patterns in data) to extract insight from data.

Data science is all about data: collecting it, cleaning it, analyzing it, visualizing it, and using it to make our lives better. Handling large amounts of data can be a challenging task for data scientists. Most of the time, the data we need to process and analyze is much larger than the capacity of our devices (the size of the RAM).
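One common way around the RAM limit is to process the data in chunks. Below is a minimal sketch assuming pandas is installed; the file name readings.csv and the column value are hypothetical.

# Compute the mean of a column in a file too large to load at once.
import pandas as pd

total, count = 0.0, 0
for chunk in pd.read_csv("readings.csv", chunksize=100_000):
    # Only one chunk of rows is held in memory at a time.
    total += chunk["value"].sum()
    count += len(chunk)
print("mean:", total / count)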

Data scientist

A data scientist is a professional responsible for collecting, analyzing, and interpreting extremely large amounts of data. The data scientist role is an offshoot (outgrowth) of several traditional technical roles, including mathematician, scientist, statistician, and computer professional.

As a data scientist, you will need to design, create, and interact with databases on most of the projects you work on. Sometimes you will need to create everything from scratch (to create something from scratch is to make it without any ingredients or materials prepared ahead of time), while at other times you will just need to know how to communicate with an already existing database.

Big data analytics

What is Data?

Data refers to the quantities, characters, or symbols on which operations are performed by a computer, which may be stored and transmitted in the form of electrical signals and recorded on magnetic, optical, or mechanical recording media.

What is Big Data?

Big Data is a collection of data that is huge in volume, yet growing exponentially with time. It is data of such large size and complexity that no traditional data management tool can store or process it efficiently. In short, big data is still data, just of huge size.

What is big data analytics?

Big data analytics is the use of advanced analytic techniques against very large, diverse data sets that include structured, semi-structured, and unstructured data, from different sources, and in different sizes from terabytes to zettabytes.
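As an illustration, here is a minimal PySpark sketch, assuming Apache Spark is installed; the file trades.csv and the column symbol are hypothetical. The same few lines work whether the data is megabytes on a laptop or terabytes spread across a cluster.

# Aggregate a large dataset with Spark's distributed engine.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("analytics-sketch").getOrCreate()
df = spark.read.csv("trades.csv", header=True, inferSchema=True)
df.groupBy("symbol").count().show()   # count trades per stock symbol
spark.stop()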

Examples of big data analytics

Big data analytics helps businesses to get insights from today's huge data resources. People, organizations, and machines now produce massive amounts of data. 

Examples of Big Data

Stock exchanges

Social media sites

Jet engines

Cloud applications

Machine sensor data 

etc.

A jet engine is a machine that converts energy-rich liquid fuel into a powerful pushing force called thrust. Jet engines are a big data example because their sensors generate enormous volumes of data during every flight.

The New York Stock Exchange is an example of Big Data that generates about one terabyte of new trade data per day.

III. Internet of Things (IoT) and Industrial Internet of Things (IIoT)

Introduction to Internet of Things (IoT)

The Internet of Things, or IoT, is revolutionizing day-to-day business decision making and information gathering. Businesses can stream incoming data from connected devices, buildings, vehicles, wearables, and other devices that have sensors to optimize systems, help predict failures, improve efficiency, and create better outcomes.

The Internet of things (IoT) describes physical objects (or groups of such objects) that are embedded with sensors, processing ability, software, and other technologies, and that connect and exchange data with other devices and systems over the Internet or other communications networks.

The field has evolved due to the convergence of multiple technologies, including ubiquitous computing (ubiquitous: present, appearing, or found everywhere), commodity sensors, increasingly powerful embedded systems, and machine learning. The traditional fields of embedded systems, wireless sensor networks, control systems, and automation (including home and building automation) independently and collectively enable the Internet of things.

In the consumer market, IoT technology is most synonymous with products pertaining to the concept of the "smart home", including devices and appliances (such as lighting fixtures, thermostats, home security systems and cameras, and other home appliances) that support one or more common ecosystems, and can be controlled via devices associated with that ecosystem, such as smartphones and smart speakers. The IoT can also be used in healthcare systems.

There are a number of concerns about the risks in the growth of IoT technologies and products, especially in the areas of privacy and security, and consequently, industry and governmental moves to address these concerns have begun, including the development of international and local standards, guidelines, and regulatory frameworks.

What is the Internet of Things (IoT) and how does it work?

The Internet of Things (IoT) describes the network of physical objects—“things”—that are embedded with sensors, software, and other technologies for the purpose of connecting and exchanging data with other devices and systems over the internet.
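As an illustration, here is a minimal sketch of a sensor publishing a reading over MQTT, a protocol widely used in IoT, assuming the paho-mqtt library (1.x API) is installed. The broker address and topic name are hypothetical.

# An IoT device publishes a sensor reading to an MQTT broker.
import json
import paho.mqtt.client as mqtt

client = mqtt.Client()
client.connect("broker.example.com", 1883)   # 1883 is the standard MQTT port
reading = {"sensor": "temp-01", "celsius": 21.5}
client.publish("home/livingroom/temperature", json.dumps(reading))
client.disconnect()

Any other device or system subscribed to the same topic receives the reading, which is the "connecting and exchanging data" in the definition above.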

Best IoT Examples in 2020

Home Security. The Internet of Things is the key driver behind a completely smart and secure home; smart security cameras provide alerts and peace of mind.

Activity Trackers. Wearable trackers monitor metrics such as heart rate, sleep, and movement in real time.

Digital Twins.

Self-Healing Machines.

AR Glasses.

Ingestible Sensors.

Smart Farming.

Smart Contact Lenses.

What is a digital twin?

A digital twin is a virtual model designed to accurately reflect a physical object. The object being studied (for example, a wind turbine) is outfitted with sensors that collect data about its condition and performance.

What are the types of digital twins?

Generally speaking, there are three types of digital twin: Product, Production, and Performance. The combination and integration of the three digital twins as they evolve together is known as the digital thread.
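As a toy illustration of the concept, the Python sketch below mirrors a physical wind turbine in software; the sensor readings and temperature threshold are made up for the example.

# A toy digital twin: a virtual object kept in sync with a physical one.
class TurbineTwin:
    def __init__(self, turbine_id):
        self.turbine_id = turbine_id
        self.rpm = 0.0
        self.temperature = 0.0

    def update(self, reading):
        # Each sensor reading refreshes the virtual model's state.
        self.rpm = reading["rpm"]
        self.temperature = reading["temperature"]

    def needs_inspection(self):
        # Analysis runs on the twin instead of the physical machine.
        return self.temperature > 80.0

twin = TurbineTwin("WT-07")
twin.update({"rpm": 14.2, "temperature": 85.3})
print(twin.needs_inspection())   # True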

How is AR used today?

“Augmented Reality Smart Glasses are defined as wearable Augmented Reality (AR) devices that are worn like regular glasses and merge virtual information with physical information in a user's view field.”

Augmented (to make greater) reality is now used in medical training. Its applications range from MRI applications to performing highly delicate surgery. At the Cleveland Clinic at Case Western Reserve University, for example, students are taught the ins and outs of anatomy using AR headsets or augmented reality glasses.

How do ingestible sensors work?

An ingestible sensor embedded in the pill is able to record that the medication was taken, sending signals to a wearable patch that then transmits the data to a mobile app.

Ingestible sensors—pill-sized electronics that ping your smartphone with data after you pop and swallow—have started to arrive on the market. They don't do much yet: Mostly they measure pH, temperature, and pressure or monitor whether or not patients have taken their meds.

IIoT Architecture

(Figures: the Purdue Enterprise Reference Architecture model and the IoT reference model, showing the approximate correspondence between levels in the Purdue model and the basic structure of the IoT.)

The IIoT is enabled by technologies such as cybersecurity, cloud computing, edge computing, mobile technologies, machine-to-machine communication, 3D printing, advanced robotics, big data, the internet of things, RFID technology, and cognitive computing.

Five of the most important ones are described below:

Cyber-physical systems (CPS): the basic technology platform for IoT and IIoT and therefore the main enabler to connect physical machines that were previously disconnected. CPS integrates the dynamics of the physical process with those of software and communication, providing abstractions and modeling, design, and analysis techniques.

Cloud computing: with cloud computing, IT services and resources can be uploaded to and retrieved from the Internet, as opposed to requiring a direct connection to a server. Files can be kept on cloud-based storage systems rather than on local storage devices.

Edge computing: a distributed computing paradigm that brings computation and data storage closer to the location where they are needed. In contrast to cloud computing, edge computing refers to decentralized data processing at the edge of the network. The industrial internet requires an edge-plus-cloud architecture, rather than one based on a purely centralized cloud, in order to transform productivity, products, and services in the industrial world. (A toy sketch of the edge idea follows this list.)

Big data analytics: Big data analytics is the process of examining large and varied data sets, or big data.

Artificial intelligence and machine learning: Artificial intelligence (AI) is a field within computer science in which intelligent machines are created that work and react like humans. Machine learning is a core part of AI, allowing software to more accurately predict outcomes without explicitly being programmed.
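As promised above, here is a toy sketch of the edge computing idea: raw readings are processed locally, and only a compact summary crosses the network. The send_to_cloud function is a hypothetical stand-in for a real upload call.

# Edge-side processing: summarize locally, upload only the summary.
def send_to_cloud(summary):
    print("uploading:", summary)   # stand-in for a real network call

def edge_gateway(readings):
    # The full stream of raw readings never leaves the edge device.
    summary = {
        "count": len(readings),
        "mean": sum(readings) / len(readings),
        "max": max(readings),
    }
    send_to_cloud(summary)

edge_gateway([21.5, 22.1, 21.8, 35.0])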

What is Industry 4.0 (the Industrial Internet of Things, IIoT)?

Industry 4.0, also sometimes referred to as IIoT or smart manufacturing, marries physical production and operations with smart digital technology, machine learning, and big data to create a more holistic and better connected ecosystem for companies that focus on manufacturing and supply chain management.

Introduction to Industrial internet of things

The industrial internet of things (IIoT) refers to interconnected sensors, instruments, and other devices networked together with computers in industrial applications, including manufacturing and energy management.

This connectivity allows for data collection, exchange, and analysis, potentially facilitating improvements in productivity and efficiency as well as other economic benefits. The IIoT is an evolution of a distributed control system (DCS) that allows for a higher degree of automation by using cloud computing to refine and optimize the process controls.

What is Industrial Internet of Things examples?

In Industrial IoT use cases, smart devices may be deployed in construction vehicles, supply chain robotics, solar and wind power, agricultural sensor systems, smart irrigation, and more. These IIoT applications tend to have one thing in common: they are deployed in challenging environments.

How is the Industrial Internet of Things (IIoT) different from the Internet of Things (IoT)?

The main difference between the two is their general usage: while IoT is most commonly used for consumer applications, IIoT is used for industrial purposes such as manufacturing, supply chain monitoring, and management systems.

 

