DIGITAL FLUENCY QUESTIONS AND ANSWERS MODULE 1

ARTIFICIAL INTELLIGENCE, MACHINE LEARNING, DEEP LEARNING

 Questions and Answers

 What is Artificial Intelligence?

Artificial Intelligence refers to software or models that can perform complex tasks on their own, without requiring any assistance from humans. Artificial Intelligence is a field of study consisting of various sub-fields, including machine learning, deep learning, neural networks, computer vision, natural language processing, and much more.

How powerful is AI?

The power of AI depends on the capability of the researchers building it and the computation available to the program. As of now, AI is powerful enough to solve the set of tasks assigned to it efficiently and effectively. However, it hasn't reached its peak yet, and we are still some years away from that point.

Will AI steal our jobs?

The demand for skilled AI specialists is growing faster than ever before. Requirements and open positions for experts in the sub-fields of AI, such as machine learning, deep learning, computer vision, statistics, and natural language processing, are growing every day. So AI will pave the way for more jobs for the humans who build and oversee these systems. Humans are intellectual beings. Hence, AI will simplify the complexity of human work but won't actually take away our jobs.

Can AI take over the world?

Artificial Intelligence has come a long way and developed into a unique feature of the modern world. Despite the advancements in AI, most tasks are still done under some kind of human supervision, whether in the working or the development stages.

Artificial Intelligence is also limited to the particular tasks it is programmed to complete. So, as of today, AI taking over the world is unlikely.

What are the advantages of AI?

Apart from the massive job opportunities created by AI, it also has other advantages, such as completing repetitive tasks, which humans would otherwise need to perform, without making errors.

Artificial Intelligence, similar to a computer program, cannot tire and hence has the capacity to work all day long on a particular task until the desired results are accomplished.

They have the ability to perform faster computations compared to human speed on a wide range of problems with precise results. They also have tons of real-life applications to make our daily lives simpler.

What are the disadvantages of AI?

Constructing Artificial Intelligence models from scratch can sometimes be time-consuming and resource-intensive. Building such models may not be possible on a regular PC.

Deploying Artificial Intelligence models can also be quite expensive in some cases. The maintenance costs when AI models malfunction in peculiar cases can also be difficult to deal with and resolve.

As of today, AI cannot be used to accomplish higher-level intellectual tasks.

What are the applications of AI?

Artificial Intelligence has a wide variety of real-world applications, accompanying you from the start of the day to its end. When you start your day with your smartphone, you make use of AI capabilities such as smart face unlock or fingerprint recognition to unlock your phone.

 

Then, when you decide to Google something, you are greeted with AI features such as autocomplete and autocorrect, which use sequence-to-sequence modeling. Apart from smartphones, Artificial Intelligence has many other applications, including email spam detection, virtual assistants, chatbots, optical character recognition, and much more.
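As an aside, a toy sketch can hint at the idea behind autocomplete. Real search engines use trained sequence-to-sequence neural models; the illustrative version below simply ranks previously seen queries by how often they occurred, using an invented query history.

```python
from collections import Counter

def autocomplete(prefix, history):
    """Return past queries starting with `prefix`, most frequent first."""
    counts = Counter(history)  # how often each past query was seen
    return sorted((q for q in counts if q.startswith(prefix)),
                  key=lambda q: -counts[q])

# Made-up search history for illustration
history = ["weather today", "weather tomorrow", "weather today", "news"]
print(autocomplete("weather", history))
# ['weather today', 'weather tomorrow']
```

This captures only the ranking intuition; a neural model would also generate completions it has never seen verbatim.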

Artificial Intelligence also finds applications in many other fields, ranging from robotics and medical sciences to logistics, transportation, finance, and many other utility services across industries.

Do you need to be a genius to start learning AI?

No, not necessarily. Artificial Intelligence is a field containing many sub-fields. It is worth investing your precious time to gain further knowledge of AI if you are particularly interested in the various intriguing concepts it offers.

While learning AI from scratch might be hard at the beginning, it becomes more interesting as you invest more time in learning the numerous concepts related to AI. You will gain exposure to mathematics, programming, machine learning, and much more, broadening your knowledge.

Even if you find that the field of Artificial Intelligence is not suitable for your particular interests, it is still totally fine as long as you learn something about the numerous topics of AI.

The knowledge you gain from learning AI can be partially or completely utilized for various software applications and jobs as well.

How will artificial intelligence affect healthcare?

AI can lead to better care outcomes and improve the productivity and efficiency of care delivery. It can also improve the day-to-day life of healthcare practitioners, letting them spend more time looking after patients, thereby raising staff morale and improving retention.

Which is the best application of AI in the healthcare sector?

Some of the beneficial applications of AI for healthcare purposes would be administrative workflows, image analysis, robotic surgery, virtual assistants, and clinical decision support.

Will AI in healthcare make doctors redundant?

AI can enhance clinical productivity because it can handle a large volume of tasks that are well suited to automation. AI can reduce the burden of clerical work on physicians, thus improving the quality of care and allowing them to spend more time with patients and the healthcare team.

What do you understand by the term robotics?

Robotics is a combined branch of engineering and science that deals with the design, development, operation, and control of intelligent robots. Robotics is a part of Artificial Intelligence. Robotics technology is used to develop machines that can perform complex human tasks very efficiently.

What are various types of sensors used in the robotics?

Various types of sensors used in robots include light sensors, sound sensors, temperature sensors, proximity sensors, and acceleration and navigation sensors.

What can a digital assistant do?

A digital assistant pulls data from multiple sources and puts it into context. Advanced natural language processing gives it the ability to process what you are saying or typing, and advanced natural language understanding (NLU) gives it the ability to parse what you say or type and then generate accurate answers.
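A crude sketch can show the first step an assistant performs: mapping an utterance to an intent. Real assistants use trained NLU models; the keyword rules and intent names below are invented purely for illustration.

```python
# Toy intent parser: maps keywords to intents with simple rules.
# Real digital assistants use trained NLU models instead.
INTENTS = {
    "weather": ["weather", "temperature", "rain"],
    "alarm": ["alarm", "wake", "remind"],
    "music": ["play", "song", "music"],
}

def parse_intent(utterance):
    """Return the first intent whose keyword appears in the utterance."""
    words = utterance.lower().split()
    for intent, keywords in INTENTS.items():
        if any(k in words for k in keywords):
            return intent
    return "unknown"

print(parse_intent("Will it rain tomorrow?"))  # weather
print(parse_intent("play my favourite song"))  # music
```

After the intent is known, a real assistant would pull data from the relevant source (a weather service, a music library) to generate the answer.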

Is a chat bot a digital assistant?

Data-driven and predictive (conversational AI) chatbots are also called virtual assistants or digital assistants. Apple's Siri and Amazon's Alexa are examples of consumer-oriented, data-driven, predictive AI chatbots.

What is autonomous software?

An autonomous system is one that can achieve a given set of goals in a changing environment, gathering information about the environment and working for an extended period of time without human control or intervention.

What technologies are needed for autonomous vehicles?

There are several critical technologies behind safe and efficient autonomous-vehicle operation—AI, safety and security, cameras, network infrastructure, and the sensor technologies radar and lidar, or laser-light radar.

Why are autonomous things important?

Autonomous things are fundamentally important because they represent the first real disconnection of machines from explicit human guidance. Humans are used to programming things, but are not used to them acting in autonomous ways. Self-driving vehicles are still making their way onto the roads.

Note: Trainer should encourage discussion on these FAQs and motivate students to come up with different answers

DATABASE MANAGEMENT FOR DATA SCIENCE, BIG DATA ANALYTICS

What is data science?

Data science is an interdisciplinary field that uses scientific methods, processes, algorithms and systems to extract knowledge and insights from noisy, structured and unstructured data, and apply knowledge and actionable insights from data across a broad range of application domains.

What is the need for Data Science?

The reason why we need data science is the ability to process and interpret data. This enables companies to make informed decisions around growth, optimization, and performance. Demand for skilled data scientists is on the rise now and in the next decade.

What is Data Science useful for?

Data science is a process that empowers better business decision-making through interpreting, modeling, and deployment. This helps in visualizing data that is understandable for business stakeholders to build future roadmaps and trajectories. Implementing Data Science for businesses is now a mandate for any business looking to grow.

How does Facebook use data analytics to understand your posts?

With 1.2 billion people uploading 136,000 photos and updating their status 293,000 times per minute on Facebook, the platform generates vast amounts of unstructured data (information that isn't easily quantified and put into rows and tables for computer analysis).

Textual analysis - A large proportion of the data shared on Facebook is still text. Facebook uses a tool it developed itself, called DeepText, to extract meaning from the words we post by learning to analyze them contextually. Neural networks analyze the relationships between words to understand how their meaning changes depending on the words around them. The tool learns for itself based on how words are used, and it can easily switch between different human languages and apply what it has learned from one to another.

 

How does Facebook use data analytics to understand your posts and recognize your face?

Facial recognition - Facebook uses a deep learning application called DeepFace to recognize people in photos. Facebook says its most advanced image recognition tool is more successful than humans at recognizing whether two different images show the same person, with DeepFace scoring a 97% success rate compared to 96% for humans.
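The core idea behind systems like DeepFace is that each face image is mapped to a numeric embedding vector, and two faces are judged "same person" when their embeddings are sufficiently similar. The sketch below illustrates that final comparison step only; the embedding vectors and the threshold are made-up stand-ins for real model outputs.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def same_person(emb1, emb2, threshold=0.9):
    """Decide 'same person' when embeddings are similar enough (illustrative threshold)."""
    return cosine_similarity(emb1, emb2) >= threshold

face_a = [0.1, 0.8, 0.3]     # embedding of photo 1 (invented values)
face_b = [0.12, 0.79, 0.31]  # embedding of photo 2 (invented values)
print(same_person(face_a, face_b))  # True for near-identical embeddings
```

In a real system the embeddings would come from a deep neural network trained on millions of labelled face images.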

INTERNET OF THINGS (IOT) AND INDUSTRIAL INTERNET OF THINGS (IIOT)

What are the main parts of IoT systems?

An IoT system consists of three main parts: sensors, network connectivity, and data storage applications.
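The three parts can be sketched end-to-end in plain Python. Everything here is simulated for illustration: the "sensor" invents readings, the "network" is a pass-through standing in for a transport such as MQTT or HTTP, and storage is a simple list.

```python
import random

def read_sensor():
    """1. Sensor: produce a (simulated) temperature reading."""
    return {"temperature_c": round(random.uniform(18, 30), 1)}

def send_over_network(reading):
    """2. Network connectivity: stand-in for MQTT/HTTP transport."""
    return dict(reading)

storage = []  # 3. Data storage application

for _ in range(3):
    storage.append(send_over_network(read_sensor()))

print(len(storage), "readings stored")
```

A real deployment would replace each stage with hardware and infrastructure, but the data flow (sense, transmit, store) is the same.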

What are the security concerns related to IoT?

Data security and privacy are major concerns related to IoT. These devices are vulnerable to hacking and cloud endpoints could be used by hackers to attack servers. Software developers and device designers have to ensure adequate security and privacy measures.

Are IoT and digitization the same?

IoT is strictly an aspect of digitization (using data to drive a business), but in practice the terms are used interchangeably with little consequence.

What is a thing? What is not a thing?

Anything! Any tangible object in the real world can be a thing, but data must be retrievable from the thing for it to be considered IoT. In the crudest case, that might involve mounting a camera in front of some analogue dials and taking pictures to record activities.

Can Alexa be part of IoT?

Yes. As an application user interface that is part of the IoT ecosystem, Alexa can be used to report or command IoT actions.

What are examples of smart devices?

The smartphone is an example of a smart device. It is a source of real-world data and a place to consume IoT insights. Other smart device examples include self-driving cars or sports and running watches, which collect and give feedback on runner data.

Why should we learn about IoT?

Learning about IoT helps IT decision-makers propose innovations that can drive improved business and personal outcomes. Implementing new solutions can save money and time, as well as improve safety and efficiency.

What is the role of artificial intelligence in IoT?

AI is used to interpret the information that IoT devices gather from the physical world, analyzing it with data science techniques.

What are examples of IoT used in devices?

Examples include a robotic manufacturing machine, a physical environment sensor (like temperature, humidity, and light), or a remote-control light switch.

Another example involves measuring the health of each physical system on a car (engine, brakes, transmission, satellite navigation, etc.) and determining if maintenance can be delayed (saving money) or brought forward (avoiding breakdown or failure). By doing this, a positive experience of the car can be maintained. This may please the owner who does not change vehicles very often. For the planet's sake, maximizing the use of everything we make is essential.
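The car-maintenance decision described above can be sketched as a simple rule: compare each subsystem's health score against thresholds and decide whether service can be delayed or must be brought forward. The health scores and thresholds below are invented for illustration; a real system would derive them from sensor data and failure models.

```python
def maintenance_action(health):
    """health: 0.0 (failing) .. 1.0 (like new); thresholds are illustrative."""
    if health > 0.8:
        return "delay maintenance"         # saving money
    if health > 0.4:
        return "schedule as planned"
    return "bring forward maintenance"     # avoiding breakdown or failure

# Invented per-subsystem health scores for one car
car = {"engine": 0.9, "brakes": 0.35, "transmission": 0.6}
for part, health in car.items():
    print(part, "->", maintenance_action(health))
```

The value of IoT here is that the health scores come from live sensors on the vehicle rather than from a fixed service calendar.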

 

What devices are part of IoT?

Almost everything we touch can be a part of IoT, but devices must be able to provide information either directly (from sensors) or indirectly (e.g., from a video camera).

What is IoT programming?

IoT programming involves working with data to produce outcomes. Besides using programming languages, well-known data analysis frameworks used in data science have a major role to play.

Which language is better for IoT?

The choice of language is use case-dependent. Python works well and is a great starting place, as many people already know how to use it.

Is Python good for IoT?

Yes, Python is great for starting out and experimenting, with a Raspberry Pi for example. Arduino is another great platform for learning, using C/C++.

Is coding required for IoT?

IoT does not always require coding. Phones driven by Alexa may download apps that require simple configuration. At home, I use Kasa and Any.do. With other examples, the app will consist of tools you configure to get answers.

CLOUD COMPUTING AND ITS SERVICE MODELS

What is cloud computing?

Cloud computing is described as the process of using a network of remote servers, hosted via the internet, to store, manage and process data, rather than hosting it locally. Essentially, cloud computing is using someone else's infrastructure and hardware, reducing the number of capital investments your business needs to make.

What are the types of cloud?

The three types of cloud computing are:

Public cloud is cloud computing that is delivered via the internet and shared across organizations.

Private cloud is cloud computing that is dedicated solely to your organization.

A hybrid cloud is an environment that uses both public and private clouds.

What are the various cloud models?

IaaS (Infrastructure as a service) is also known as Hardware as a Service (HaaS). It is a computing infrastructure managed over the internet. The main advantage of using IaaS is that it helps users to avoid the cost and complexity of purchasing and managing the physical servers.

PaaS (Platform as a Service) is a cloud computing platform created for programmers to develop, test, run, and manage applications.

SaaS (Software as a Service) is also known as “on-demand software”. It is software in which the applications are hosted by a cloud service provider. Users can access these applications with the help of an internet connection and a web browser.

How does Cloud storage work?

With Cloud storage, the files and data you need are placed on highly secure remote systems stored in a provider's facility rather than on your computer's hard drive or local server. Internet access allows you to connect your computer or device to the remote cloud solution to retrieve what you need.

How do vendors charge for cloud services?

Cloud service providers use different pricing mechanisms: usage-based, subscription-based, dynamic usage, etc. There is also a model called "PAYG" (Pay As You Go), which means you pay for what you have used and terminate the services when you no longer need them. This works out extremely well for customers, who incur low hardware and operations costs by outsourcing infrastructure to the cloud vendor.
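Pay-as-you-go billing is simple arithmetic: the total is the per-unit rate times the units consumed, summed over the metered services. The rates and usage figures below are invented and do not reflect any real provider's pricing.

```python
# Illustrative per-unit rates (invented, not any real provider's pricing)
RATES = {"compute_hours": 0.05, "storage_gb_month": 0.02, "egress_gb": 0.09}

def payg_bill(usage):
    """Sum rate * quantity over each metered service, rounded to cents."""
    return round(sum(RATES[svc] * qty for svc, qty in usage.items()), 2)

usage = {"compute_hours": 100, "storage_gb_month": 50, "egress_gb": 10}
print(payg_bill(usage))  # 0.05*100 + 0.02*50 + 0.09*10 = 6.9
```

When the services are terminated, usage stops accruing and so does the bill, which is the appeal of the model.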

What are the benefits of cloud computing?

The adoption of cloud computing by organizations has increased exponentially in recent years due to the efficiency and cost-saving benefits this computing model promises, which is especially appealing to organizations with limited IT staff and/or limited IT budgets. Independent Software Vendors (ISVs), in particular, can also benefit substantially from delivering software applications as a service, which offers many operational and administrative cost savings over the traditional model of on-premise software delivery.

Is the cloud safe for personal information?

With numerous high-profile hacks of personal information in recent years (most notably Target and Anthem), cloud providers have worked to step up their security game for personal information. If a cloud provider has certifications in place for HIPAA, PCI-DSS, and SOC, it has been evaluated by a third party and deemed qualified to handle personal/private information.

CYBER SECURITY

What is Cyber Security?

 Cyber security refers to the specialization of computer network security that consists of technologies, policies, and procedures that protect networked computer systems from unauthorized use or harm.

 Why do we need Cyber Security?

 The increasing reliance of our information-age economies and governments on cyber (computer-based) infrastructure makes them progressively more vulnerable to cyber attacks on our computer systems, networks, and data.

 In their most disruptive form, cyber attacks target the enterprise, government, military, or other infrastructural assets of a nation or its citizens.

 Both the volume and sophistication of cyber threats (cyber warfare, cyber terrorism, cyber espionage and malicious hacking) are increasing, and pose potent threats to our enterprise, government, military, or other infrastructural assets.

What is a Cyber Attack?

 An offensive action by a malicious actor that is intended to undermine the functions of networked computers and their related resources, including unauthorized access, unapproved changes, and malicious destruction. Examples of cyber attacks include Distributed Denial of Service (DDoS) and Man-in-the-Middle (MITM) attacks.

 What are the differences among the terms cyber attack, cyber threat & cyber risk?

 The terms cyber attack, cyber threat, and cyber risk are interrelated as follows. A cyber attack is an offensive action, whereas a cyber threat is the possibility that a particular attack may occur, and the cyber risk associated with the subject threat estimates the probability of potential losses that may result.

For example, a Distributed Denial of Service (DDoS) cyber attack by a botnet is a cyber threat for many enterprises with online retail websites, where the associated cyber risk is a function of lost revenues due to website downtime and the probability that a DDoS cyber attack will occur.


What is malware?

Malware is an umbrella term derived from "malicious software", and refers to any software that is intrusive (unauthorized access), disruptive, or destructive to computer systems and networks. Malware may take many forms (executable code, data files) and includes, but is not limited to, computer viruses, worms, trojan horses (trojans), bots (botnets), spyware (system monitors, adware, tracking cookies), rogueware (scareware, ransomware), and other malicious programs. The majority of active malware threats are usually worms or trojans rather than viruses.

What is cyber hygiene?

Cyber hygiene is a colloquial term that refers to best practices and other activities that computer system administrators and users can undertake to improve their cyber security while engaging in common online activities, such as web browsing, emailing, and texting.

What is cyberspace?

Cyberspace is the virtual environment that consists of computer systems and networks, where all computers communicate via networks and all networks are connected. The term originated in science fiction during the 1980s and became popular during the 1990s. More recently, computer vendors have been attempting to brand parts of cyberspace as the "Internet of Things" (IoT).

 What is a firewall?

 A firewall is a network security system that monitors incoming and outgoing network message traffic and prevents the transmission of malicious messages based on an updatable rule set. In effect, a firewall establishes a barrier between a trusted, secure internal network and external networks (e.g., the Internet) that are assumed to be untrustworthy and non-secure. Firewalls can be implemented as software that runs on general-purpose hardware (e.g., an open source firewall on a Windows PC or Mac OS X computer) or a dedicated hardware device (appliance).

How does a firewall work?

 Firewalls function as a filter between a trusted, secure internal network and external networks (e.g., the Internet) that are assumed to be untrustworthy and non-secure. The firewall filter may be flexibly programmed to control what information packets are allowed and blocked.
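A simplified packet filter in the spirit described above can be sketched as an ordered rule list: each packet is matched against the rules in order, and the first match decides allow or block. The rules, ports, and addresses below are invented; real firewalls (iptables, pf) work on the same first-match principle with far richer rule languages.

```python
# Illustrative rule set (invented): first matching rule wins.
RULES = [
    {"port": 22,  "src_prefix": "10.", "action": "allow"},  # SSH from internal net
    {"port": 22,  "src_prefix": "",    "action": "block"},  # SSH from anywhere else
    {"port": 443, "src_prefix": "",    "action": "allow"},  # HTTPS from anywhere
]

def filter_packet(packet, default="block"):
    """Return the action of the first rule matching the packet's port and source."""
    for rule in RULES:
        if packet["port"] == rule["port"] and packet["src"].startswith(rule["src_prefix"]):
            return rule["action"]
    return default  # deny by default: unmatched traffic is blocked

print(filter_packet({"src": "10.1.2.3", "port": 22}))     # allow
print(filter_packet({"src": "203.0.113.9", "port": 22}))  # block
print(filter_packet({"src": "203.0.113.9", "port": 80}))  # block (default)
```

The "deny by default" fallback reflects the barrier idea: anything not explicitly permitted from the untrusted side is stopped.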

 What is anti-virus software?

Anti-virus software, also known as anti-malware software, is computer software used to scan files to identify and eliminate malicious software (malware). Although anti-virus software was originally developed to detect and remove computer viruses (hence its name), it has been broadened in scope to detect other malware, such as worms, Trojan horses, adware, spyware, ransomware, etc.

 How does anti-virus software work?

 Anti-virus software typically uses two different techniques to identify and eliminate malware:

 Virus dictionary approach: The anti-virus software scans a file while referring to a dictionary of known virus signatures that have been previously identified. If a code segment in the file matches any virus signature in the virus dictionary, then the anti-virus software performs one or more of the following operations: deletes the file; quarantines the file so that it is unable to spread; or attempts to repair the file by removing the virus from the file.

 

Suspicious behavior approach: The anti-virus software monitors the behavior of all programs, flagging suspicious behavior, such as one executing program attempting to write data to another executable program. The user is alerted to all suspicious behavior and is queried about how it should be handled.
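The virus-dictionary approach above can be illustrated as a search for known byte patterns in a file's contents. The signatures and names below are invented for the sketch; real engines use far more robust matching (hashes, wildcards, emulation) plus the behavioral heuristics just described.

```python
# Invented signature dictionary: byte pattern -> malware name
SIGNATURES = {
    b"\xde\xad\xbe\xef": "Example.Virus.A",
    b"EVILPAYLOAD":      "Example.Trojan.B",
}

def scan_bytes(data):
    """Return the names of all known signatures found in the data."""
    return [name for sig, name in SIGNATURES.items() if sig in data]

clean = b"hello world"
infected = b"header EVILPAYLOAD trailer"
print(scan_bytes(clean))     # []
print(scan_bytes(infected))  # ['Example.Trojan.B']
```

On a match, a real product would then delete, quarantine, or attempt to repair the file, as described above.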

 What is a Unified Threat Management (UTM) system and how does it work?

A Unified Threat Management (UTM) system provides multiple security services in a single device or service on a network. UTM security services can include, but are not limited to:

Scanning incoming data using Deep Packet Inspection (DPI) to secure the network from viruses and other malware;

Filtering website URLs to prevent access to malicious websites;

Ensuring operating systems, applications, and anti-virus software are updated automatically with the latest patches and security updates.

What is the relation between cyber security and cryptography?

Cyber security defenses are typically based on strong authentication and encryption techniques (cryptography), so cryptography is a key enabling technology for cyber security. In other words, cryptography helps to implement cyber security.
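One concrete way cryptography enables cyber security is message authentication: an HMAC lets a receiver verify both the integrity and the authenticity of a message using a shared secret key. The key and messages below are illustrative; Python's standard library provides the primitives.

```python
import hmac
import hashlib

key = b"shared-secret-key"  # illustrative; real keys must be random and secret

def sign(message):
    """Compute an authentication tag for the message under the shared key."""
    return hmac.new(key, message, hashlib.sha256).hexdigest()

def verify(message, tag):
    """Check the tag in constant time to resist timing attacks."""
    return hmac.compare_digest(sign(message), tag)

msg = b"transfer 100 to account 42"
tag = sign(msg)
print(verify(msg, tag))                             # True: untampered
print(verify(b"transfer 9999 to account 66", tag))  # False: message was altered
```

A tampered message fails verification because its tag no longer matches, which is exactly the kind of guarantee cyber security defenses build on.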
