CS6: System Security, ICT Ethical Issues and Emerging Technologies

This unit explains System Security, ICT Ethical Issues and Emerging Technologies

Computer System Security

Computer security, cybersecurity or information technology security (IT security) is the protection of computer systems from theft or damage to their hardware, software or electronic data, as well as from disruption or misdirection of the services they provide.

The field is growing in importance due to increasing reliance on computer systems, the Internet and wireless networks such as Bluetooth and Wi-Fi, and due to the growth of “smart” devices, including smartphones, televisions and the various tiny devices that constitute the Internet of things. Due to its complexity, both in terms of politics and technology, it is also one of the major challenges of the contemporary world.

Vulnerabilities and attacks

A vulnerability is a weakness in design, implementation, operation or internal control. Most of the vulnerabilities that have been discovered are documented in the Common Vulnerabilities and Exposures (CVE) database.

An exploitable vulnerability is one for which at least one working attack or “exploit” exists. Vulnerabilities are often hunted or exploited with the aid of automated tools or manually using customized scripts.

To secure a computer system, it is important to understand the attacks that can be made against it. These threats can typically be classified into one of the categories below:

Backdoor

A backdoor in a computer system, a cryptosystem or an algorithm is any secret method of bypassing normal authentication or security controls. Backdoors may exist for a number of reasons, including original design or poor configuration. They may have been added by an authorized party to allow some legitimate access, or by an attacker for malicious reasons; but regardless of the motives for their existence, they create a vulnerability.

Denial-of-service attacks

Denial-of-service attacks (DoS) are designed to make a machine or network resource unavailable to its intended users. Attackers can deny service to individual victims, such as by deliberately entering a wrong password enough consecutive times to cause the victim’s account to be locked, or they may overload the capabilities of a machine or network and block all users at once.
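
As a simple illustration of the first kind of attack, the sketch below (Python, with invented names; not production code) shows an account-lockout policy that disables an account after a fixed number of consecutive failed logins. An attacker who deliberately submits wrong passwords can abuse exactly this mechanism to lock the legitimate user out.

    # Minimal sketch (not production code): an account-lockout policy that an
    # attacker can abuse to deny service by deliberately failing logins.
    MAX_FAILURES = 5  # consecutive failures allowed before lockout

    class Account:
        def __init__(self, username, password):
            self.username = username
            self._password = password   # a real system would store a salted hash
            self.failed_attempts = 0
            self.locked = False

        def login(self, password):
            if self.locked:
                return "account locked - contact support"
            if password == self._password:
                self.failed_attempts = 0
                return "login ok"
            self.failed_attempts += 1
            if self.failed_attempts >= MAX_FAILURES:
                self.locked = True      # denial of service for the real owner
            return "wrong password"

    # An attacker who knows only the username can lock the victim out:
    victim = Account("alice", "correct-horse-battery-staple")
    for _ in range(MAX_FAILURES):
        victim.login("guess")
    print(victim.login("correct-horse-battery-staple"))  # -> account locked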

While a network attack from a single IP address can be blocked by adding a new firewall rule, many forms of Distributed denial of service (DDoS) attacks are possible, where the attack comes from a large number of points – and defending is much more difficult. Such attacks can originate from the zombie computers of a botnet, but a range of other techniques are possible including reflection and amplification attacks, where innocent systems are fooled into sending traffic to the victim.

Direct-access attacks

An unauthorized user gaining physical access to a computer is most likely able to directly copy data from it. They may also compromise security by making operating system modifications, installing software worms, keyloggers, covert listening devices or using wireless mice. Even when the system is protected by standard security measures, these may be bypassed by booting another operating system or tool from a CD-ROM or other bootable media. Disk encryption and the Trusted Platform Module are designed to prevent these attacks.

Eavesdropping

Eavesdropping is the act of surreptitiously listening to a private conversation, typically between hosts on a network. For instance, programs such as Carnivore and NarusInSight have been used by the FBI and NSA to eavesdrop on the systems of internet service providers. Even machines that operate as a closed system (i.e., with no contact to the outside world) can be eavesdropped upon via monitoring the faint electromagnetic transmissions generated by the hardware; TEMPEST is a specification by the NSA referring to these attacks.

Multi-vector, polymorphic attacks

In 2017, a new class of multi-vector, polymorphic cyber threats surfaced that combined several types of attacks and changed form to avoid cybersecurity controls as they spread. These threats have been classified as fifth-generation cyber-attacks.

Phishing

Phishing is the attempt to acquire sensitive information such as usernames, passwords, and credit card details directly from users. Phishing is typically carried out by email spoofing or instant messaging, and it often directs users to enter details at a fake website whose look and feel are almost identical to the legitimate one. The fake website often asks for personal information, such as log-in details and passwords. This information can then be used to gain access to the individual’s real account on the real website. Preying on a victim’s trust, phishing can be classified as a form of social engineering.
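
One common technical tell of a phishing message is a link whose visible text shows a legitimate-looking domain while the underlying href points somewhere else. The sketch below is a simplified illustration in Python (standard library only; the class name and sample addresses are made up), not a complete phishing filter.

    # Flag links whose visible text names one domain while the underlying
    # href points to a different one - a common phishing pattern.
    from html.parser import HTMLParser
    from urllib.parse import urlparse

    class LinkChecker(HTMLParser):
        def __init__(self):
            super().__init__()
            self._href = None
            self._text = []
            self.suspicious = []          # (visible text, real destination)

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                self._href = dict(attrs).get("href") or ""
                self._text = []

        def handle_data(self, data):
            if self._href is not None:
                self._text.append(data)

        def handle_endtag(self, tag):
            if tag == "a" and self._href is not None:
                shown = "".join(self._text).strip()
                real_host = urlparse(self._href).netloc
                # If the visible text looks like a domain, it should match the real host.
                shown_host = urlparse(shown if "://" in shown else "http://" + shown).netloc
                if shown_host and real_host and shown_host != real_host:
                    self.suspicious.append((shown, self._href))
                self._href = None

    checker = LinkChecker()
    checker.feed('<a href="http://evil.example.net/login">www.mybank.com</a>')
    print(checker.suspicious)   # [('www.mybank.com', 'http://evil.example.net/login')]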

Privilege escalation

Privilege escalation describes a situation where an attacker with some level of restricted access is able to, without authorization, elevate their privileges or access level. For example, a standard computer user may be able to fool the system into giving them access to restricted data; or even become “root” and have full unrestricted access to a system.

Social engineering

Social engineering aims to convince a user to disclose secrets such as passwords, card numbers, etc. by, for example, impersonating a bank, a contractor, or a customer.

A common scam involves fake CEO emails sent to accounting and finance departments. In early 2016, the FBI reported that the scam had cost US businesses more than $2bn in about two years.

In May 2016, the Milwaukee Bucks NBA team was the victim of this type of cyber scam with a perpetrator impersonating the team’s president Peter Feigin, resulting in the handover of all the team’s employees’ 2015 W-2 tax forms.

Spoofing

Spoofing is the act of masquerading as a valid entity through falsification of data (such as an IP address or username), in order to gain access to information or resources that one is otherwise unauthorized to obtain. There are several types of spoofing, including:

  • Email spoofing, where an attacker forges the sending (From, or source) address of an email (a short illustrative check follows this list).
  • IP address spoofing, where an attacker alters the source IP address in a network packet to hide their identity or impersonate another computing system.
  • MAC spoofing, where an attacker modifies the Media Access Control (MAC) address of their network interface to pose as a valid user on a network.
  • Biometric spoofing, where an attacker produces a fake biometric sample to pose as another user.
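
As a simplified illustration of why email spoofing works, the sketch below (Python standard library only; the message shown is invented) compares the domain in the displayed From: header with the domain of the envelope sender recorded in the Return-Path header. Real mail systems rely on stronger checks such as SPF, DKIM and DMARC, which this sketch does not implement.

    # The From: header is attacker-controlled, so compare its domain with the
    # envelope sender (Return-Path) as a crude spoofing signal.
    from email import message_from_string
    from email.utils import parseaddr

    raw_message = (
        "Return-Path: <bounce@mailer.example.net>\n"
        'From: "Your Bank" <support@mybank.com>\n'
        "To: victim@example.org\n"
        "Subject: Please verify your account\n"
        "\n"
        "Click the link below...\n"
    )

    msg = message_from_string(raw_message)

    def domain_of(header_value):
        return parseaddr(header_value)[1].rpartition("@")[2].lower()

    from_domain = domain_of(msg.get("From", ""))
    envelope_domain = domain_of(msg.get("Return-Path", ""))

    if from_domain and envelope_domain and from_domain != envelope_domain:
        print("possible spoofing: From says", from_domain,
              "but envelope says", envelope_domain)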

Tampering

Tampering describes a malicious modification of products. Examples include so-called “Evil Maid” attacks and the planting of surveillance capability into routers by security services.

THIS VIDEO EXPLAINS COMPUTER SYSTEM SECURITY

Privacy and ICT Ethical Issues

In computer security a countermeasure is an action, device, procedure, or technique that reduces a threat, a vulnerability, or an attack by eliminating or preventing it, by minimizing the harm it can cause, or by discovering and reporting it so that corrective action can be taken. Some common countermeasures are listed in the following sections:

Security by design

Security by design, or alternately secure by design, means that the software has been designed from the ground up to be secure. In this case, security is considered as a main feature.

Some of the techniques in this approach include:

  • The principle of least privilege, where each part of the system has only the privileges that are needed for its function. That way even if an attacker gains access to that part, they have only limited access to the whole system.
  • Automated theorem proving to prove the correctness of crucial software subsystems.
  • Code reviews and unit testing, approaches to make modules more secure where formal correctness proofs are not possible.
  • Defense in depth, where the design is such that more than one subsystem needs to be violated to compromise the integrity of the system and the information it holds.
  • Default secure settings, and design to “fail secure” rather than “fail insecure” (see fail-safe for the equivalent in safety engineering); a small sketch of the fail-secure idea follows this list. Ideally, a secure system should require a deliberate, conscious, knowledgeable and free decision on the part of legitimate authorities in order to make it insecure.
  • Audit trails tracking system activity, so that when a security breach occurs, the mechanism and extent of the breach can be determined. Storing audit trails remotely, where they can only be appended to, can keep intruders from covering their tracks.
  • Full disclosure of all vulnerabilities, to ensure that the “window of vulnerability” is kept as short as possible when bugs are discovered.
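
As a small illustration of the “fail secure” principle mentioned above, the sketch below (Python, with invented function names; not a real access-control API) denies a request whenever the authorization check itself cannot be completed, instead of silently letting the request through.

    # Fail secure: if the authorization check itself fails, deny the request
    # rather than fall back to allowing it.
    class AuthServiceUnavailable(Exception):
        pass

    def query_authorization_service(user, resource):
        # Stand-in for a real call to a policy or directory service.
        raise AuthServiceUnavailable("policy server unreachable")

    def is_allowed(user, resource):
        try:
            return query_authorization_service(user, resource)
        except AuthServiceUnavailable:
            # An outage must not become an open door.
            return False

    print(is_allowed("alice", "/payroll/reports"))   # -> False while the service is down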

Security architecture

The Open Security Architecture organization defines IT security architecture as “the design artifacts that describe how the security controls (security countermeasures) are positioned, and how they relate to the overall information technology architecture. These controls serve the purpose to maintain the system’s quality attributes: confidentiality, integrity, availability, accountability and assurance services”.

Techopedia defines security architecture as “a unified security design that addresses the necessities and potential risks involved in a certain scenario or environment. It also specifies when and where to apply security controls. The design process is generally reproducible.” The key attributes of security architecture are:

  • the relationship of different components and how they depend on each other.
  • the determination of controls based on risk assessment, good practice, finances, and legal matters.
  • the standardization of controls.

Security measures

A state of computer “security” is the conceptual ideal, attained by the use of three processes: threat prevention, detection, and response. These processes are based on various policies and system components, which include the following:

  • User account access controls and cryptography can protect systems files and data, respectively.
  • Firewalls are by far the most common prevention systems from a network security perspective as they can (if properly configured) shield access to internal network services, and block certain kinds of attacks through packet filtering. Firewalls can be both hardware- or software-based.
  • Intrusion Detection System (IDS) products are designed to detect network attacks in-progress and assist in post-attack forensics, while audit trails and logs serve a similar function for individual systems.
  • “Response” is necessarily defined by the assessed security requirements of an individual system and may cover the range from simple upgrade of protections to notification of legal authorities, counter-attacks, and the like. In some special cases, a complete destruction of the compromised system is favored, as it may happen that not all the compromised resources are detected.

Today, computer security comprises mainly “preventive” measures, like firewalls or an exit procedure. A firewall can be defined as a way of filtering network data between a host or a network and another network, such as the Internet, and can be implemented as software running on the machine, hooking into the network stack (or, in the case of most UNIX-based operating systems such as Linux, built into the operating system kernel) to provide real-time filtering and blocking. Another implementation is a so-called “physical firewall”, which consists of a separate machine filtering network traffic. Firewalls are common amongst machines that are permanently connected to the Internet.
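
The sketch below is a toy packet filter in Python, not a real firewall; the rules and field names are invented. It shows the core idea described above: each packet's source address and destination port are checked against an ordered rule list, the first matching rule decides, and anything unmatched is dropped (a default-deny policy).

    # Toy packet filter: apply ordered allow/deny rules to packets described
    # by (source IP, destination port). The first matching rule wins.
    RULES = [
        {"action": "deny",  "src": "203.0.113.7", "dport": None},  # block one attacker IP
        {"action": "allow", "src": None,          "dport": 443},   # allow HTTPS from anyone
        {"action": "allow", "src": None,          "dport": 22},    # allow SSH from anyone
    ]

    def filter_packet(src_ip, dport):
        for rule in RULES:
            if rule["src"] not in (None, src_ip):
                continue
            if rule["dport"] not in (None, dport):
                continue
            return rule["action"]
        return "deny"                              # default deny

    print(filter_packet("203.0.113.7", 443))       # deny  (blocked address)
    print(filter_packet("198.51.100.4", 443))      # allow (HTTPS)
    print(filter_packet("198.51.100.4", 3389))     # deny  (no rule matches)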

Some organizations are turning to big data platforms, such as Apache Hadoop, to extend data accessibility and machine learning to detect advanced persistent threats.

However, relatively few organisations maintain computer systems with effective detection systems, and fewer still have organized response mechanisms in place. As a result, as Reuters points out: “Companies for the first time report they are losing more through electronic theft of data than physical stealing of assets”. The primary obstacle to effective eradication of cybercrime could be traced to excessive reliance on firewalls and other automated “detection” systems. Yet it is basic evidence gathering by using packet capture appliances that puts criminals behind bars.

Vulnerability management

Vulnerability management is the cycle of identifying, remediating, and mitigating vulnerabilities, especially in software and firmware. Vulnerability management is integral to computer security and network security.

Vulnerabilities can be discovered with a vulnerability scanner, which analyzes a computer system in search of known vulnerabilities such as open ports, insecure software configuration, and susceptibility to malware.
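
The open-port part of such a scan can be approximated with a few lines of socket code. The sketch below (Python standard library, checking only a handful of well-known ports on the local machine) reports which ports accept TCP connections; real scanners go much further, fingerprinting services and checking configurations. Only scan systems you own or have explicit permission to test.

    # Minimal sketch of the "open port" part of a vulnerability scan.
    import socket

    def scan_ports(host, ports, timeout=0.5):
        open_ports = []
        for port in ports:
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
                sock.settimeout(timeout)
                # connect_ex returns 0 when the connection succeeds (port open)
                if sock.connect_ex((host, port)) == 0:
                    open_ports.append(port)
        return open_ports

    print(scan_ports("127.0.0.1", [22, 80, 443, 3306, 8080]))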

Beyond vulnerability scanning, many organizations contract outside security auditors to run regular penetration tests against their systems to identify vulnerabilities. In some sectors, this is a contractual requirement.

Reducing vulnerabilities

While formal verification of the correctness of computer systems is possible, it is not yet common. Operating systems formally verified include seL4, and SYSGO’s PikeOS – but these make up a very small percentage of the market.

Two-factor authentication is a method for mitigating unauthorized access to a system or sensitive information. It requires “something you know” (a password or PIN) and “something you have” (a card, dongle, cellphone, or other piece of hardware). This increases security, as an unauthorized person needs both of these to gain access.
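
The “something you have” factor is often a phone running a one-time-password app. The sketch below implements the standard time-based one-time password (TOTP) calculation of RFC 6238 using only the Python standard library; the shared secret shown is a made-up example, provisioned in practice via a QR code.

    # TOTP (RFC 6238) sketch: the server and the user's device share a secret
    # and independently derive the same 6-digit code for each 30-second window.
    import base64, hashlib, hmac, struct, time

    def totp(shared_secret_b32, interval=30, digits=6):
        key = base64.b32decode(shared_secret_b32, casefold=True)
        counter = int(time.time()) // interval               # current time step
        msg = struct.pack(">Q", counter)                     # 8-byte big-endian counter
        digest = hmac.new(key, msg, hashlib.sha1).digest()   # HMAC-SHA1 per the RFC
        offset = digest[-1] & 0x0F                           # dynamic truncation
        code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % (10 ** digits)).zfill(digits)

    print(totp("JBSWY3DPEHPK3PXP"))   # example secret; prints a 6-digit code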

Social engineering and direct computer access (physical) attacks can only be prevented by non-computer means, which can be difficult to enforce relative to the sensitivity of the information. Training is often involved to help mitigate this risk, but even in highly disciplined environments (e.g. military organizations), social engineering attacks can still be difficult to foresee and prevent.

Inoculation, derived from inoculation theory, seeks to prevent social engineering and other fraudulent tricks or traps by instilling a resistance to persuasion attempts through exposure to similar or related attempts.

It is possible to reduce an attacker’s chances by keeping systems up to date with security patches and updates, using a security scanner, and/or hiring competent people responsible for security, though even systems developed by competent people get penetrated. The effects of data loss or damage can be reduced by careful backing up and by insurance.

Hardware protection mechanisms

While hardware may be a source of insecurity, such as with microchip vulnerabilities maliciously introduced during the manufacturing process, hardware-based or assisted computer security also offers an alternative to software-only computer security. Using devices and methods such as dongles, trusted platform modules, intrusion-aware cases, drive locks, disabling USB ports, and mobile-enabled access may be considered more secure due to the physical access (or sophisticated backdoor access) required in order to be compromised. Each of these is covered in more detail below.

  • USB dongles are typically used in software licensing schemes to unlock software capabilities, but they can also be seen as a way to prevent unauthorized access to a computer or other device’s software. The dongle, or key, essentially creates a secure encrypted tunnel between the software application and the key. The principle is that an encryption scheme on the dongle, such as Advanced Encryption Standard (AES) provides a stronger measure of security, since it is harder to hack and replicate the dongle than to simply copy the native software to another machine and use it. Another security application for dongles is to use them for accessing web-based content such as cloud software or Virtual Private Networks (VPNs). In addition, a USB dongle can be configured to lock or unlock a computer.
  • Trusted platform modules (TPMs) secure devices by integrating cryptographic capabilities onto access devices, through the use of microprocessors, or so-called computers-on-a-chip. TPMs used in conjunction with server-side software offer a way to detect and authenticate hardware devices, preventing unauthorized network and data access.
  • Computer case intrusion detection refers to a device, typically a push-button switch, which detects when a computer case is opened. The firmware or BIOS is programmed to show an alert to the operator when the computer is booted up the next time.
  • Drive locks are essentially software tools to encrypt hard drives, making them inaccessible to thieves; a sketch of the underlying idea follows this list. Tools exist specifically for encrypting external drives as well.
  • Disabling USB ports is a security option for preventing unauthorized and malicious access to an otherwise secure computer. Infected USB dongles connected to a network from a computer inside the firewall are considered by the magazine Network World as the most common hardware threat facing computer networks.
  • Disconnecting or disabling peripheral devices (such as cameras, GPS, and removable storage) that are not in use.
  • Mobile-enabled access devices are growing in popularity due to the ubiquitous nature of cell phones. Built-in capabilities such as Bluetooth, the newer Bluetooth low energy (LE), Near field communication (NFC) on non-iOS devices and biometric validation such as thumb print readers, as well as QR code reader software designed for mobile devices, offer new, secure ways for mobile phones to connect to access control systems. These control systems provide computer security and can also be used for controlling access to secure buildings.
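
Drive locks and full-disk encryption rest on symmetric ciphers: data encrypted at rest is useless to a thief without the key. The sketch below is a rough, file-level illustration using the widely used third-party cryptography package (pip install cryptography); real drive locks such as BitLocker or LUKS work at the block-device level and usually derive the key from a passphrase or a TPM.

    # Illustration of the idea behind drive locks: ciphertext without the key
    # is opaque bytes. Requires the third-party "cryptography" package.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()        # in practice derived from a passphrase or TPM
    cipher = Fernet(key)

    plaintext = b"payroll.xlsx contents ..."
    ciphertext = cipher.encrypt(plaintext)

    # With the key, decryption recovers the data exactly; without it, nothing useful.
    assert cipher.decrypt(ciphertext) == plaintext
    print(len(ciphertext), "bytes of ciphertext")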

Secure operating systems

One use of the term “computer security” refers to technology that is used to implement secure operating systems. In the 1980s the United States Department of Defense (DoD) used the “Orange Book” standards, but the current international standard ISO/IEC 15408, “Common Criteria” defines a number of progressively more stringent Evaluation Assurance Levels. Many common operating systems meet the EAL4 standard of being “Methodically Designed, Tested and Reviewed”, but the formal verification required for the highest levels means that they are uncommon. An example of an EAL6 (“Semiformally Verified Design and Tested”) system is Integrity-178B, which is used in the Airbus A380 and several military jets.

Secure coding

In software engineering, secure coding aims to guard against the accidental introduction of security vulnerabilities. It is also possible to create software designed from the ground up to be secure. Such systems are “secure by design”. Beyond this, formal verification aims to prove the correctness of the algorithms underlying a system; important for cryptographic protocols for example.
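
A classic secure-coding example is preventing SQL injection. The sketch below (Python with the built-in sqlite3 module; the table, column and sample values are invented) contrasts unsafe string concatenation with a parameterized query.

    # Parameterized queries prevent SQL injection; building SQL by string
    # concatenation lets crafted input rewrite the query.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

    user_input = "x' OR '1'='1"        # attacker-supplied value

    # Unsafe: the input becomes part of the SQL text and changes its meaning.
    unsafe = conn.execute(
        "SELECT secret FROM users WHERE name = '" + user_input + "'").fetchall()

    # Safe: the driver passes the value separately, so it cannot alter the query.
    safe = conn.execute(
        "SELECT secret FROM users WHERE name = ?", (user_input,)).fetchall()

    print("unsafe query leaked:", unsafe)   # returns every row
    print("parameterized query:", safe)     # returns nothing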

THIS VIDEO EXPLAINS PRIVACY AND ICT ETHICAL ISSUES

Emerging Technologies

Emerging technologies are technologies whose development, practical applications, or both are still largely unrealized, such that they are figuratively emerging into prominence from a background of nonexistence or obscurity. These technologies are generally new but also include older technologies that are still controversial and relatively undeveloped in potential, such as preimplantation genetic diagnosis and gene therapy (which date to circa 1990 but even today have large undeveloped potential). Emerging technologies are often perceived as capable of changing the status quo.

Emerging technologies are characterized by radical novelty (in application even if not in origins), relatively fast growth, coherence, prominent impact, and uncertainty and ambiguity. In other words, an emerging technology can be defined as “a radically novel and relatively fast growing technology characterised by a certain degree of coherence persisting over time and with the potential to exert a considerable impact on the socio-economic domain(s) which is observed in terms of the composition of actors, institutions and patterns of interactions among those, along with the associated knowledge production processes. Its most prominent impact, however, lies in the future and so in the emergence phase is still somewhat uncertain and ambiguous.”

Emerging technologies include a variety of technologies such as educational technology, information technology, nanotechnology, biotechnology, cognitive science, psychotechnology, robotics, and artificial intelligence.

New technological fields may result from the technological convergence of different systems evolving towards similar goals. Convergence brings previously separate technologies such as voice (and telephony features), data (and productivity applications) and video together so that they share resources and interact with each other, creating new efficiencies.

Emerging technologies are those technical innovations which represent progressive developments within a field for competitive advantage; converging technologies represent previously distinct fields which are in some way moving towards stronger inter-connection and similar goals. However, the opinion on the degree of the impact, status and economic viability of several emerging and converging technologies varies.

History of emerging technologies

In the history of technology, emerging technologies are contemporary advances and innovation in various fields of technology.

Over centuries innovative methods and new technologies are developed and opened up. Some of these technologies are due to theoretical research, and others from commercial research and development.

Technological growth includes incremental developments and disruptive technologies. An example of the former was the gradual roll-out of DVD (digital video disc) as a development intended to follow on from the previous optical technology compact disc. By contrast, disruptive technologies are those where a new method replaces the previous technology and makes it redundant, for example, the replacement of horse-drawn carriages by automobiles and other vehicles.

Emerging technology debates

Many writers, including computer scientist Bill Joy, have identified clusters of technologies that they consider critical to humanity’s future. Joy warns that the technology could be used by elites for good or evil. They could use it as “good shepherds” for the rest of humanity or decide everyone else is superfluous and push for mass extinction of those made unnecessary by technology.[7]

Advocates of the benefits of technological change typically see emerging and converging technologies as offering hope for the betterment of the human condition. Cyberphilosophers Alexander Bard and Jan Söderqvist argue in The Futurica Trilogy that while Man himself is basically constant throughout human history (genes change very slowly), all relevant change is rather a direct or indirect result of technological innovation (memes change very fast), since new ideas always emanate from technology use and not the other way around. Man should consequently be regarded as history’s main constant and technology as its main variable. However, critics of the risks of technological change, and even some advocates such as transhumanist philosopher Nick Bostrom, warn that some of these technologies could pose dangers, perhaps even contribute to the extinction of humanity itself; i.e., some of them could involve existential risks.

Much ethical debate centers on issues of distributive justice in allocating access to beneficial forms of technology. Some thinkers, including environmental ethicist Bill McKibben, oppose the continuing development of advanced technology partly out of fear that its benefits will be distributed unequally in ways that could worsen the plight of the poor. By contrast, inventor Ray Kurzweil is among techno-utopians who believe that emerging and converging technologies could and will eliminate poverty and abolish suffering.[12]

Some analysts such as Martin Ford, author of The Lights in the Tunnel: Automation, Accelerating Technology and the Economy of the Future,[13] argue that as information technology advances, robots and other forms of automation will ultimately result in significant unemployment as machines and software begin to match and exceed the capability of workers to perform most routine jobs.

As robotics and artificial intelligence develop further, even many skilled jobs may be threatened. Technologies such as machine learning[14] may ultimately allow computers to do many knowledge-based jobs that require significant education. This may result in substantial unemployment at all skill levels, stagnant or falling wages for most workers, and increased concentration of income and wealth as the owners of capital capture an ever-larger fraction of the economy. This in turn could lead to depressed consumer spending and economic growth as the bulk of the population lacks sufficient discretionary income to purchase the products and services produced by the economy.

Examples

Artificial intelligence

Artificial intelligence (AI) is the intelligence exhibited by machines or software, and the branch of computer science that develops machines and software with animal-like intelligence. Major AI researchers and textbooks define the field as “the study and design of intelligent agents,” where an intelligent agent is a system that perceives its environment and takes actions that maximize its chances of success. John McCarthy, who coined the term in 1955, defines it as “the study of making intelligent machines”.

The central problems (or goals) of AI research include reasoning, knowledge, planning, learning, natural language processing (communication), perception and the ability to move and manipulate objects. General intelligence (or “strong AI”) is still among the field’s long-term goals. Currently popular approaches include deep learning, statistical methods, computational intelligence and traditional symbolic AI. There is an enormous number of tools used in AI, including versions of search and mathematical optimization, logic, methods based on probability and economics, and many others.
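
Search is the most basic of the tools listed above. As a tiny, self-contained illustration (a toy state graph with made-up states, not a real planning problem), the sketch below uses breadth-first search to find a shortest sequence of moves from a start state to a goal state.

    # Breadth-first search over a toy state graph: a minimal example of the
    # "search" tool used throughout AI to find a shortest plan.
    from collections import deque

    GRAPH = {                   # each state and the states reachable from it
        "start": ["a", "b"],
        "a": ["c"],
        "b": ["c", "goal"],
        "c": ["goal"],
        "goal": [],
    }

    def shortest_plan(start, goal):
        frontier = deque([[start]])          # queue of partial paths
        visited = {start}
        while frontier:
            path = frontier.popleft()
            if path[-1] == goal:
                return path
            for nxt in GRAPH[path[-1]]:
                if nxt not in visited:
                    visited.add(nxt)
                    frontier.append(path + [nxt])
        return None

    print(shortest_plan("start", "goal"))    # ['start', 'b', 'goal']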

3D Printing

3D printing, also known as additive manufacturing, has been posited by Jeremy Rifkin and others as part of the third industrial revolution.

Combined with Internet technology, 3D printing would allow for digital blueprints of virtually any material product to be sent instantly to another person to be produced on the spot, making purchasing a product online almost instantaneous.

Although this technology is still too crude to produce most products, it is rapidly developing and created a controversy in 2013 around the issue of 3D printed guns.

Gene therapy

Gene therapy was first successfully demonstrated in late 1990/early 1991 for adenosine deaminase deficiency, though the treatment was somatic – that is, did not affect the patient’s germ line and thus was not heritable. This led the way to treatments for other genetic diseases and increased interest in germ line gene therapy – therapy affecting the gametes and descendants of patients.

Between September 1990 and January 2014, there were around 2,000 gene therapy trials conducted or approved.

Cancer vaccines

A cancer vaccine is a vaccine that treats existing cancer or prevents the development of cancer in certain high-risk individuals. Vaccines that treat existing cancer are known as therapeutic cancer vaccines. There are currently no vaccines able to prevent cancer in general.

On April 14, 2009, The Dendreon Corporation announced that their Phase III clinical trial of Provenge, a cancer vaccine designed to treat prostate cancer, had demonstrated an increase in survival. It received U.S. Food and Drug Administration (FDA) approval for use in the treatment of advanced prostate cancer patients on April 29, 2010. The approval of Provenge has stimulated interest in this type of therapy.

In vitro meat

In vitro meat, also called cultured meat, clean meat, cruelty-free meat, shmeat, and test-tube meat, is an animal-flesh product that has never been part of a living animal, with the exception of the fetal calf serum taken from a slaughtered cow. In the 21st century, several research projects have worked on in vitro meat in the laboratory.

The first in vitro beef burger, created by a Dutch team, was eaten at a demonstration for the press in London in August 2013. There remain difficulties to be overcome before in vitro meat becomes commercially available. Cultured meat is prohibitively expensive, but it is expected that the cost could be reduced to compete with that of conventionally obtained meat as technology improves.

In vitro meat is also an ethical issue. Some argue that it is less objectionable than traditionally obtained meat because it doesn’t involve killing and reduces the risk of animal cruelty, while others disagree with eating meat that has not developed naturally.

Nanotechnology

Nanotechnology (sometimes shortened to nanotech) is the manipulation of matter on an atomic, molecular, and supramolecular scale. The earliest widespread description of nanotechnology referred to the particular technological goal of precisely manipulating atoms and molecules for fabrication of macroscale products, also now referred to as molecular nanotechnology.

A more generalized description of nanotechnology was subsequently established by the National Nanotechnology Initiative, which defines nanotechnology as the manipulation of matter with at least one dimension sized from 1 to 100 nanometers. This definition reflects the fact that quantum mechanical effects are important at this quantum-realm scale, and so the definition shifted from a particular technological goal to a research category inclusive of all types of research and technologies that deal with the special properties of matter that occur below the given size threshold.

Robotics

Robotics is the branch of technology that deals with the design, construction, operation, and application of robots, as well as computer systems for their control, sensory feedback, and information processing. These technologies deal with automated machines that can take the place of humans in dangerous environments or manufacturing processes, or resemble humans in appearance, behavior, and/or cognition. A good example of a robot that resembles humans is Sophia, a social humanoid robot developed by Hong Kong-based company Hanson Robotics which was activated on April 19, 2015. Many of today’s robots are inspired by nature contributing to the field of bio-inspired robotics.

Stem cell therapy

Stem cell therapy is an intervention strategy that introduces new adult stem cells into damaged tissue in order to treat disease or injury. Many medical researchers believe that stem cell treatments have the potential to change the face of human disease and alleviate suffering. The ability of stem cells to self-renew and give rise to subsequent generations with variable degrees of differentiation capacities offers significant potential for generation of tissues that can potentially replace diseased and damaged areas in the body, with minimal risk of rejection and side effects.

Distributed ledger technology

Distributed ledger or blockchain technology is a technology which provides transparent and immutable lists of transactions. Blockchains can enable autonomous transactions through the use of smart contracts. Smart contracts are self-executing transactions which occur when pre-defined conditions are met.
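
The “immutable list” property comes from chaining cryptographic hashes: each block records the hash of the previous block, so altering any earlier transaction changes every later hash. The sketch below is a toy illustration in Python of that one idea only; it has none of the consensus, signatures or networking of a real blockchain.

    # Toy hash chain: each block commits to the previous block's hash, so
    # tampering with history is detectable.
    import hashlib, json

    def block_hash(block):
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    def append_block(chain, transactions):
        prev_hash = block_hash(chain[-1]) if chain else "0" * 64
        chain.append({"prev_hash": prev_hash, "transactions": transactions})

    def verify(chain):
        return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
                   for i in range(1, len(chain)))

    chain = []
    append_block(chain, ["alice pays bob 5"])
    append_block(chain, ["bob pays carol 2"])
    print(verify(chain))                                   # True

    chain[0]["transactions"] = ["alice pays mallory 500"]  # tamper with history
    print(verify(chain))                                   # False - chain no longer verifies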

The original idea of a smart contract was conceived by Nick Szabo in 1994, but these original theories about how these smart contracts could work remained unrealised because there was no technology to support programmable agreements and transactions between parties.

His example of a smart contract was a vending machine, which holds goods until money has been received and then releases the goods to the buyer. The machine holds the property and is able to enforce the contract. There were two main issues that needed to be addressed before smart contracts could be used in the real world.

Firstly, smart contracts need control of physical assets in order to enforce agreements. Secondly, there was a lack of trustworthy computers reliable and trusted enough to execute a contract between two or more parties. It is only with the advent of cryptocurrency and encryption that the technology for smart contracts has come to fruition.
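
Szabo’s vending-machine example can be written down as a small state machine: the “contract” releases the goods automatically once the pre-defined condition (enough money inserted) is met. The sketch below is only an illustration in plain Python, not code for any real smart-contract platform.

    # The vending machine as a tiny self-executing contract: goods are released
    # once the pre-defined condition (price paid) is satisfied.
    class VendingContract:
        def __init__(self, price):
            self.price = price
            self.paid = 0

        def insert_coin(self, amount):
            self.paid += amount
            return self.execute()

        def execute(self):
            if self.paid >= self.price:      # condition met: self-executing transfer
                change = self.paid - self.price
                self.paid = 0
                return {"goods": "delivered", "change": change}
            return {"goods": "held", "still_owed": self.price - self.paid}

    machine = VendingContract(price=100)
    print(machine.insert_coin(50))    # {'goods': 'held', 'still_owed': 50}
    print(machine.insert_coin(70))    # {'goods': 'delivered', 'change': 20}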

Many potential applications of smart contracts have been suggested that go beyond the transfer of value from one party to another, such as supply chain management, electronic voting, law and the internet of things.

Medical Field Advancements

With cloud computing delivering data faster, the medical field is taking advantage by creating digital health records. Digital health records can greatly improve the efficiency with which hospitals handle patients: hospitals can improve public health by sharing valuable information about an illness, smooth the workflow because doctors can pull up a patient’s records with ease, and even lower healthcare costs by using less paper (Banova). With the advancement of cloud computing, information can be delivered faster to doctors, helping the medical field grow.

Daily lives being changed

The daily lives of everyone are being positively impacted as technology grows and advances. Technology affects people’s lives in several ways: from checking the weather in an instant, to doctors being able to contact their patients at the push of a button, it makes day-to-day life easier. Other examples include parents being able to monitor what their children are doing, families streaming their entertainment over the internet, and parents being able to pay their bills online (Williams). Technology creates convenience and helps the daily lives of everyone in the world.

Miscellaneous

Many other inventions are being developed to help in useful situations. One is a new type of airplane made out of polymer pieces that will allow the plane to change shape during flight (Pappas).

A different invention will help police officers stop criminals and defuse potentially dangerous situations. It shoots out restraints that stop a criminal in his or her tracks and prevent the criminal from doing anything else (Carbone). Other inventions will help with disaster relief. These include a machine to purify water wherever it is needed, drones to help scout the area of a disaster, and a new vehicle by Honda that can help in different ways at an emergency, for example by aiding in wildfire relief (Osbun). Technology can help in many different ways and in numerous different situations.

Development of emerging technologies

As innovation drives economic growth, and large economic rewards come from new inventions, a great deal of resources (funding and effort) goes into the development of emerging technologies. Some of the sources of these resources are described below.

Research and development

Research and development is directed towards the advancement of technology in general, and therefore includes development of emerging technologies. 

Applied research is a form of systematic inquiry involving the practical application of science. It accesses and uses some part of the research community’s (academia’s) accumulated theories, knowledge, methods, and techniques for a specific, often state-, business-, or client-driven purpose.

Science policy is the area of public policy which is concerned with the policies that affect the conduct of the science and research enterprise, including the funding of science, often in pursuance of other national policy goals such as technological innovation to promote commercial product development, weapons development, health care and environmental monitoring.

DARPA

The Defense Advanced Research Projects Agency (DARPA) is an agency of the U.S. Department of Defense responsible for the development of emerging technologies for use by the military.

DARPA was created in 1958 as the Advanced Research Projects Agency (ARPA) by President Dwight D. Eisenhower. Its purpose was to formulate and execute research and development projects to expand the frontiers of technology and science, with the aim to reach beyond immediate military requirements.

Projects funded by DARPA have provided significant technologies that influenced many non-military fields, such as the Internet and Global Positioning System technology.

Technology competitions and awards

There are awards that provide incentive to push the limits of technology (generally synonymous with emerging technologies). Note that while some of these awards reward achievement after-the-fact via analysis of the merits of technological breakthroughs, others provide incentive via competitions for awards offered for goals yet to be achieved.

The Orteig Prize was a $25,000 award offered in 1919 by French hotelier Raymond Orteig for the first nonstop flight between New York City and Paris. In 1927, underdog Charles Lindbergh won the prize in a modified single-engine Ryan aircraft called the Spirit of St. Louis. In total, nine teams spent $400,000 in pursuit of the Orteig Prize.

The XPRIZE series of awards, public competitions designed and managed by the non-profit organization called the X Prize Foundation, are intended to encourage technological development that could benefit mankind. The most high-profile XPRIZE to date was the $10,000,000 Ansari XPRIZE relating to spacecraft development, which was awarded in 2004 for the development of SpaceShipOne.

The Turing Award is an annual prize given by the Association for Computing Machinery (ACM) to “an individual selected for contributions of a technical nature made to the computing community.” It is stipulated that the contributions should be of lasting and major technical importance to the computer field. The Turing Award is generally recognized as the highest distinction in computer science, and in 2014 grew to $1,000,000.

The Millennium Technology Prize is awarded once every two years by Technology Academy Finland, an independent fund established by Finnish industry and the Finnish state in partnership. The first recipient was Tim Berners-Lee, inventor of the World Wide Web.

In 2003, David Gobel seed-funded the Methuselah Mouse Prize (Mprize) to encourage the development of new life extension therapies in mice, which are genetically similar to humans. So far, three Mouse Prizes have been awarded: one for breaking longevity records to Dr. Andrzej Bartke of Southern Illinois University; one for late-onset rejuvenation strategies to Dr. Stephen Spindler of the University of California; and one to Dr. Z. Dave Sharp for his work with the pharmaceutical rapamycin.

Role of science fiction

Science fiction has criticized developing and future technologies, but also inspires innovation and new technology. This topic has been more often discussed in literary and sociological than in scientific forums. Cinema and media theorist Vivian Sobchack examines the dialogue between science fiction films and technological imagination. Technology impacts artists and how they portray their fictionalized subjects, but the fictional world gives back to science by broadening imagination. How William Shatner Changed the World is a documentary that gave a number of real-world examples of actualized technological imaginations. While more prevalent in the early years of science fiction with writers like Arthur C. Clarke, new authors still find ways to make currently impossible technologies seem closer to being realized.

THIS VIDEO EXPLAINS EMERGING TECHNOLOGIES

ICT industry

Information and communications technology (ICT) is an extensional term for information technology (IT) that stresses the role of unified communications and the integration of telecommunications (telephone lines and wireless signals) and computers, as well as necessary enterprise software, middleware, storage, and audiovisual systems, that enable users to access, store, transmit, and manipulate information.

The term ICT is also used to refer to the convergence of audiovisual and telephone networks with computer networks through a single cabling or link system. There are large economic incentives (huge cost savings due to the elimination of the telephone network) to merge the telephone network with the computer network system using a single unified system of cabling, signal distribution, and management.

ICT is a broad subject and the concepts are evolving. It covers any product that will store, retrieve, manipulate, transmit, or receive information electronically in a digital form (e.g., personal computers, digital television, email, or robots). For clarity, Zuppo provided an ICT hierarchy where all levels of the hierarchy “contain some degree of commonality in that they are related to technologies that facilitate the transfer of information and various types of electronically mediated communications”. Theoretical differences between interpersonal-communication technologies and mass-communication technologies have been identified by the philosopher Piyush Mathur. Skills Framework for the Information Age is one of many models for describing and managing competencies for ICT professionals for the 21st century.

Etymology

The phrase “information and communication technologies” has been used by academic researchers since the 1980s. The abbreviation “ICT” became popular after it was used in a report to the UK government by Dennis Stevenson in 1997, and then in the revised National Curriculum for England, Wales and Northern Ireland in 2000. However, in 2012, the Royal Society recommended that the use of the term “ICT” should be discontinued in British schools “as it has attracted too many negative connotations”. From 2014 the National Curriculum has used the word computing, which reflects the addition of computer programming into the curriculum.

Variations of the phrase have spread worldwide. The United Nations has created a “United Nations Information and Communication Technologies Task Force” and an internal “Office of Information and Communications Technology”.

Monetization

The money spent on IT worldwide has been estimated at US$3.8 trillion in 2017 and has been growing at less than 5% per year since 2009. The estimated 2018 growth of the entire ICT market is 5%. The biggest growth, of 16%, is expected in the area of new technologies (IoT, robotics, AR/VR, and AI).

The 2014 IT budget of the US federal government was nearly $82 billion. IT costs, as a percentage of corporate revenue, have grown 50% since 2002, putting a strain on IT budgets. Of current companies’ IT budgets, 75% is recurrent cost, used to “keep the lights on” in the IT department, and 25% is the cost of new initiatives for technology development.

The average IT budget has the following breakdown:

  • 31% personnel costs (internal)
  • 29% software costs (external/purchasing category)
  • 26% hardware costs (external/purchasing category)
  • 14% costs of external service providers (external/services).

The estimate of money to be spent in 2022 is just over US$6 trillion.

Technological capacity

The world’s technological capacity to store information grew from 2.6 (optimally compressed) exabytes in 1986 to 15.8 in 1993, over 54.5 in 2000, and to 295 (optimally compressed) exabytes in 2007, and some 5 zettabytes in 2014. This is the informational equivalent to 1.25 stacks of CD-ROM from the earth to the moon in 2007, and the equivalent of 4,500 stacks of printed books from the earth to the sun in 2014.

The world’s technological capacity to receive information through one-way broadcast networks was 432 exabytes of (optimally compressed) information in 1986, 715 (optimally compressed) exabytes in 1993, 1.2 (optimally compressed) zettabytes in 2000, and 1.9 zettabytes in 2007. The world’s effective capacity to exchange information through two-way telecommunication networks was 281 petabytes of (optimally compressed) information in 1986, 471 petabytes in 1993, 2.2 (optimally compressed) exabytes in 2000, 65 (optimally compressed) exabytes in 2007 and some 100 exabytes in 2014. 

The world’s technological capacity to compute information with humanly guided general-purpose computers grew from 3.0 × 10^8 MIPS in 1986 to 6.4 × 10^12 MIPS in 2007.
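
Taken together, those two figures imply roughly a 21,000-fold increase over 21 years, or an average growth rate of about 60% per year; the short calculation below (illustrative arithmetic only, using the numbers quoted above) makes the scale concrete.

    # Implied average annual growth of general-purpose computing capacity,
    # using the two figures quoted above.
    mips_1986 = 3.0e8
    mips_2007 = 6.4e12
    years = 2007 - 1986

    growth_factor = mips_2007 / mips_1986              # ~21,000x overall
    annual_rate = growth_factor ** (1 / years) - 1     # ~0.61, about 61% per year

    print(f"{growth_factor:,.0f}x over {years} years, about {annual_rate:.0%} per year")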

ICT Development Index

The ICT Development Index ranks and compares the level of ICT use and access across the various countries around the world. In 2014, the ITU (International Telecommunication Union) released the latest rankings of the IDI, with Denmark attaining the top spot, followed by South Korea. The top 30 countries in the rankings include most high-income countries where quality of life is higher than average, including countries from Europe and other regions such as “Australia, Bahrain, Canada, Japan, Macao (China), New Zealand, Singapore and the United States; almost all countries surveyed improved their IDI ranking this year.”

THIS VIDEO EXPLAINS INFORMATION TECHNOLOGY INDUSTRY

 

 
