Emerging Technology Trends
Emerging technologies are technologies that are
perceived as capable of changing the status quo. These technologies are
generally new but include older technologies that are still controversial and
relatively undeveloped in potential, such as preimplantation genetic diagnosis
and gene therapy, which date to 1989 and 1990 respectively.
Emerging technologies are characterized by radical novelty,
relatively fast growth, coherence, prominent impact, and uncertainty and
ambiguity. In other words, an emerging technology can be defined as "a
radically novel and relatively fast growing technology characterised by a
certain degree of coherence persisting over time and with the potential to
exert a considerable impact on the socio-economic domain(s) which is observed
in terms of the composition of actors, institutions and patterns of
interactions among those, along with the associated knowledge production
processes. Its most prominent impact, however, lies in the future and so in the
emergence phase is still somewhat uncertain and ambiguous."
Emerging technologies include a variety of technologies such
as educational technology, information technology, nanotechnology, biotechnology, cognitive
science, psychotechnology, robotics,
and artificial intelligence.
New technological fields may result from the technological convergence of
different systems evolving towards similar goals. Convergence brings previously
separate technologies such as voice (and telephony features), data (and
productivity applications) and video together so that they share resources and
interact with each other, creating new efficiencies.
Emerging technologies are those technical innovations which
represent progressive developments within a field for competitive advantage; converging
technologies represent previously distinct fields which are in some way moving
towards stronger inter-connection and similar goals. However, opinions on the
degree of impact, status, and economic viability of several emerging and
converging technologies vary.
Key Trends
1. Current trends
2. Emerging trends
Current Trends
1. APIs
In computer
programming, an application
programming interface (API) is a set of subroutine
definitions, protocols, and tools for building software. In general
terms, it is a set of clearly defined methods of communication between
various components. A good API makes it easier to develop a computer program by providing all the building blocks,
which are then put together by the programmer. An API may be for a web-based system, operating system, database system, computer hardware, or software
library. An API specification can
take many forms, but often includes specifications for routines, data
structures, object
classes, variables, or remote calls. POSIX, Windows API and ASPI are examples
of different forms of APIs. Documentation for the API is usually provided to
facilitate usage and implementation.
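To make this concrete, here is a minimal Python sketch of a library-style API; the class and method names (TemperatureStore, add_reading, average) are hypothetical and chosen purely for illustration:

```python
# A minimal sketch of a software-library API (all names hypothetical).
# The public methods form the API: the clearly defined contract callers
# rely on, while the internals behind them can change freely.

class TemperatureStore:
    """Public API: add_reading() and average()."""

    def __init__(self):
        self._readings = []  # internal detail, not part of the API

    def add_reading(self, celsius: float) -> None:
        """Record one temperature reading."""
        self._readings.append(celsius)

    def average(self) -> float:
        """Return the mean of all recorded readings."""
        if not self._readings:
            raise ValueError("no readings recorded")
        return sum(self._readings) / len(self._readings)

# A programmer builds on the API without knowing its internals:
store = TemperatureStore()
store.add_reading(21.5)
store.add_reading(23.0)
print(store.average())  # 22.25
```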
2. AI
Artificial intelligence (AI), sometimes
called machine intelligence, is intelligence demonstrated
by machines,
in contrast to the natural intelligence displayed by humans
and other animals. In computer science, AI
research is defined as the study of "intelligent
agents": any device that perceives its environment and takes
actions that maximize its chance of successfully achieving its
goals. Colloquially, the term "artificial intelligence" is
applied when a machine mimics "cognitive" functions that humans
associate with other human minds, such as "learning" and "problem
solving".
The scope of AI is disputed: as machines become increasingly
capable, tasks considered as requiring "intelligence" are often
removed from the definition, a phenomenon known as the AI effect,
leading to the quip, "AI is whatever hasn't been done yet." For
instance, optical character recognition is
frequently excluded from "artificial intelligence", having become a
routine technology. Capabilities generally classified as AI as of
2017 include successfully understanding human speech, competing at
the highest level in strategic game systems (such as chess and Go), autonomous cars,
intelligent routing in content delivery networks, and military simulations.
Artificial intelligence was founded as an academic discipline in
1956, and in the years since has experienced several waves of optimism,
followed by disappointment and the loss of funding (known as an "AI winter"), followed
by new approaches, success and renewed funding. For most of its history,
AI research has been divided into subfields that often fail to communicate with
each other. These sub-fields are based on technical considerations, such
as particular goals (e.g. "robotics" or "machine
learning"), the use of particular tools ("logic" or artificial neural networks), or deep
philosophical differences. Subfields have also been based on social
factors (particular institutions or the work of particular researchers).
The traditional problems (or goals) of AI research include reasoning, knowledge representation, planning, learning, natural language processing, perception and the ability to move and
manipulate objects. General intelligence is among the
field's long-term goals. Approaches include statistical
methods, computational
intelligence, and traditional
symbolic AI. Many tools are used in AI, including versions of search and
mathematical optimization, artificial neural networks, and methods based on
statistics, probability and economics. The AI field draws upon computer science, mathematics, psychology, linguistics, philosophy and
many others.
The field was founded on the claim that human intelligence "can be so
precisely described that a machine can be made to simulate it". This
raises philosophical arguments about the nature of the mind and the ethics
of creating artificial beings endowed with human-like intelligence, issues
that have been explored by myth, fiction, and philosophy since antiquity. Some
people also consider AI to be a danger to humanity if it progresses
unabated. Others believe that AI, unlike previous technological
revolutions, will create a risk of mass unemployment.
In the twenty-first century, AI techniques have experienced a
resurgence following concurrent advances in computer power, large amounts of data,
and theoretical understanding; and AI techniques have become an essential part
of the technology industry, helping to solve many
challenging problems in computer science.
3. IoT
The Internet of Things (IoT) is the
network of physical devices, vehicles, home appliances, and other items
embedded with electronics, software, sensors, actuators,
and connectivity which enables these things to
connect and exchange data, creating
opportunities for more direct integration of the physical world into
computer-based systems, resulting in efficiency improvements, economic
benefits, and reduced human exertion.
The number of IoT devices increased 31% year-over-year to 8.4
billion in 2017 and it is estimated that there will be 30 billion devices
by 2020. The global market value of IoT is projected to reach $7.1
trillion by 2020.
IoT involves extending internet connectivity beyond standard
devices, such as desktops, laptops, smartphones and tablets, to any range of
traditionally dumb or non-internet-enabled physical devices and
everyday objects. Embedded with technology, these devices can communicate and
interact over the internet, and they can be remotely monitored and controlled.
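As a rough sketch of such remote monitoring, the following Python snippet shows a device reporting a sensor reading to a backend over HTTP; the endpoint URL and JSON fields are hypothetical, and real deployments often prefer lighter protocols such as MQTT or CoAP:

```python
# A minimal sketch of an IoT device publishing a sensor reading.
# The endpoint and payload schema are hypothetical.

import json
import urllib.request

def publish_reading(device_id: str, celsius: float) -> int:
    payload = json.dumps({"device": device_id, "temperature": celsius})
    request = urllib.request.Request(
        "https://example.com/iot/readings",  # hypothetical endpoint
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status  # e.g. 200 on success

# An embedded device might call this on a timer:
# publish_reading("thermostat-42", 21.7)
```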
4. Bots
An Internet bot, also known as a web robot, WWW robot, or simply a bot, is a
software application that runs automated
tasks (scripts) over the Internet. Typically, bots perform tasks that are both
simple and structurally repetitive, at a much higher rate than would be
possible for a human alone. The largest use of bots is in web spidering (web
crawler), in which an automated script fetches, analyzes and files
information from web servers at many times the speed of a human. More than half
of all web traffic is made up of bots.
Efforts by servers hosting websites to counteract bots vary.
Servers may choose to outline rules on the behaviour of internet bots by
implementing a
robots.txt file: this file is simply
text stating the rules governing a bot's behaviour on that server. Any bot
interacting with (or 'spidering') any server that does not follow these rules
should, in theory, be denied access to, or removed from, the affected website.
If the only rule implementation by a server is a posted text file with no
associated program/software/app, then adhering to those rules is entirely
voluntary – in reality there is no way to enforce those rules, or even to
ensure that a bot's creator or implementer acknowledges, or even reads, the
robots.txt file contents. Some bots are "good" – e.g. search engine
spiders – while others can be used to launch malicious attacks, most
notably in political campaigns.
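For illustration, a "good" bot can voluntarily honour robots.txt with Python's standard-library parser, as in the sketch below; the target site is hypothetical:

```python
# A minimal sketch of a polite bot checking robots.txt before fetching.
# Compliance is entirely voluntary, as noted above.

import urllib.robotparser

robots = urllib.robotparser.RobotFileParser()
robots.set_url("https://example.com/robots.txt")  # hypothetical site
robots.read()  # fetch and parse the server's published rules

url = "https://example.com/private/page.html"
if robots.can_fetch("MyCrawler/1.0", url):
    print("allowed to fetch", url)
else:
    print("robots.txt disallows", url)
```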
Emerging Trends
1. Blockchain
A blockchain, originally block chain, is
a continuously growing list of records, called blocks, which
are linked and secured using cryptography. Each
block contains a cryptographic hash of the previous
block, a timestamp, and transaction data (generally
represented as a merkle tree root hash). By design, a blockchain is
resistant to modification of the data. It is "an open, distributed ledger that can record
transactions between two parties efficiently and in a verifiable and permanent
way". For use as a distributed ledger,
a blockchain is typically managed by a peer-to-peer network
collectively adhering to a protocol for inter-node communication
and validating new blocks. Once recorded, the data in any given block cannot be
altered retroactively without alteration of all subsequent blocks, which
requires consensus of the network majority.
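The hash-linking described above can be shown in a minimal Python sketch; this toy omits consensus, Merkle trees, and networking, and the field names are illustrative:

```python
# A minimal sketch of hash-linked blocks: each block stores the hash of
# its predecessor, so editing any past block breaks every later link.

import hashlib
import json
import time

def make_block(previous_hash: str, transactions: list) -> dict:
    block = {
        "previous_hash": previous_hash,
        "timestamp": time.time(),
        "transactions": transactions,
    }
    encoded = json.dumps(block, sort_keys=True).encode("utf-8")
    block["hash"] = hashlib.sha256(encoded).hexdigest()
    return block

genesis = make_block("0" * 64, [])
block1 = make_block(genesis["hash"], [{"from": "alice", "to": "bob", "amount": 5}])
block2 = make_block(block1["hash"], [{"from": "bob", "to": "carol", "amount": 2}])

# Altering block1 would change its hash and invalidate the link stored
# in block2, which is why retroactive edits require rewriting all
# subsequent blocks.
assert block2["previous_hash"] == block1["hash"]
```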
Blockchains are secure by design and
exemplify a distributed computing system with high Byzantine fault tolerance. Decentralized consensus
has therefore been achieved with a blockchain.
Blockchain was invented by Satoshi Nakamoto in
2008 to serve as the public transaction ledger of
the cryptocurrency bitcoin. The
invention of the blockchain for bitcoin made it the first digital currency to
solve the double-spending problem without the need
of a trusted authority or central server. The bitcoin design has inspired other
applications.
2. Sharing Economy
The sharing economy is an umbrella term with a range of
meanings, often used to describe economic activity involving online
transactions. Originally
growing out of the open-source community to refer to peer-to-peer based sharing of access to goods and
services, the term is now sometimes used in a broader sense to describe
any sales transactions that are done via online marketplaces, even ones that are business-to-business (B2B), rather than
peer-to-peer. For this reason, the term sharing economy has
been criticised as misleading, with some arguing that even services that enable
peer-to-peer exchange can be primarily profit-driven. However, many
commentators assert that the term is still valid as a means of describing a
generally more democratized marketplace, even when it's applied to a broader
spectrum of services. Alternatively, collaborative consumption or the sharing
economy refers rather to resource circulation systems which allow consumers a
two-sided role, in which they may act as both providers and obtainers of
resources. This vision allows for a broader understanding of the sharing
economy, based on the overarching criterion of consumers' capacity to change
roles.
3. AR
Augmented Reality (AR) is an
interactive experience of a real-world environment whose elements are
"augmented" by computer-generated perceptual information, sometimes
across multiple sensory modalities, including visual, auditory, haptic, somatosensory, and olfactory. The
overlaid sensory information can be constructive (i.e. additive to the natural
environment) or destructive (i.e. masking of the natural environment) and is
seamlessly interwoven with the physical world such that it is perceived as
an immersive aspect of the real
environment. In this way, augmented reality alters one’s ongoing perception of
a real world environment, whereas virtual reality completely
replaces the user's real world environment with a simulated one. Augmented
reality is related to two largely synonymous terms: mixed reality and computer-mediated reality.
The primary value of augmented reality is that it brings
components of the digital world into a person's perception of the real world,
and does so not as a simple display of data, but through the integration of
immersive sensations that are perceived as natural parts of an environment. The
first functional AR systems that provided immersive mixed reality experiences
for users were invented in the early 1990s, starting with the Virtual Fixtures system
developed at the U.S. Air Force's Armstrong Laboratory in 1992. The
first commercial augmented reality experiences were used largely in the
entertainment and gaming businesses, but other industries are now also
becoming interested in AR's possibilities, for example in knowledge sharing,
education, managing information overload, and organizing remote meetings.
Augmented reality is also transforming the world of education, where content
may be accessed by scanning or viewing an image with a mobile
device. Another example is an AR helmet for construction workers, which
displays information about construction sites.
4. Quantum Computing
Quantum computing is computing using quantum-mechanical phenomena,
such as superposition and entanglement. A quantum computer is
a device that performs quantum computing. Quantum computers differ from
binary digital electronic computers based on transistors.
Whereas common digital computing requires that the data be encoded into binary
digits (bits),
each of which is always in one of two definite states (0 or 1), quantum
computation uses quantum bits, which can be in superpositions of states. A quantum Turing machine is a theoretical
model of such a computer, and is also known as the universal quantum computer.
The field of quantum computing was initiated by the work of Paul Benioff and Yuri Manin in
1980, Richard Feynman in 1982, and David Deutsch in
1985.
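To give superposition some intuition, here is a minimal Python sketch that simulates one qubit as a pair of complex amplitudes and applies a Hadamard gate; this is a classical simulation for illustration, not a real quantum computation:

```python
# A minimal classical simulation of a single qubit. The state is a pair
# of complex amplitudes (a, b) for |0> and |1>, with |a|^2 + |b|^2 = 1.

import math
import random

def hadamard(state):
    """Apply the Hadamard gate, which puts a basis state into superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def measure(state):
    """Collapse the state: 0 with probability |a|^2, else 1."""
    a, _ = state
    return 0 if random.random() < abs(a) ** 2 else 1

qubit = (1 + 0j, 0 + 0j)  # definitely |0>
qubit = hadamard(qubit)   # now an equal superposition of |0> and |1>
print([measure(qubit) for _ in range(10)])  # roughly half 0s, half 1s
```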
As of 2018, the development of actual quantum computers is
still in its infancy, but experiments have been carried out in which quantum
computational operations were executed on a very small number of quantum
bits. Both practical and theoretical research continues, and many national
governments and military agencies are funding quantum computing research in
an effort to develop quantum computers for civilian, business, trade,
environmental, and national-security purposes, such as cryptanalysis. A small
20-qubit quantum computer exists and is available for experiments via the IBM
Quantum Experience project. D-Wave Systems has
been developing their own version of a quantum computer that uses annealing.
5. 3D Printing
3D printing is any of various processes in which
material is joined or solidified under computer control to create a three-dimensional object, with
material being added together (such as liquid molecules or powder grains being
fused together). 3D printing is used in both rapid
prototyping and additive manufacturing (AM).
Objects can be of almost any shape or geometry and typically are produced using
digital model data from a 3D model or
another electronic data source such as an Additive Manufacturing File (AMF)
file (usually in sequential layers). There are many different technologies, like stereolithography (SLA)
or fused deposition modeling (FDM). Thus,
unlike material removed from stock in the conventional machining process, 3D
printing or AM builds a three-dimensional object from a computer-aided design
(CAD) model or AMF file, usually by successively adding material layer by layer.
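The layer-by-layer idea can be sketched in a few lines of Python; the toy "slicer" below divides a simple cone into horizontal layers, with all dimensions and names invented for illustration:

```python
# A minimal sketch of slicing a solid into printable layers. Real
# slicers work on CAD meshes and emit toolpaths (e.g. G-code); this toy
# just reports each layer's circular cross-section for an upright cone.

def slice_cone(height_mm: float, base_radius_mm: float, layer_mm: float):
    """Yield (z, radius) for each layer, bottom to top."""
    z = 0.0
    while z < height_mm:
        # The cross-section shrinks linearly from base to tip.
        radius = base_radius_mm * (1 - z / height_mm)
        yield (round(z, 2), round(radius, 2))
        z += layer_mm

# The printer deposits material one cross-section at a time:
for z, r in slice_cone(height_mm=10.0, base_radius_mm=5.0, layer_mm=2.0):
    print(f"layer at z={z} mm: circle of radius {r} mm")
```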
The
term "3D printing" originally referred to a process that deposits
a binder material onto a powder bed
with inkjet printer heads layer by layer. More
recently, the term is being used in popular vernacular to encompass a wider
variety of additive manufacturing techniques. United States and global technical standards use the official
term additive manufacturing for this broader sense.