Pleased to be on the list of top 30 influencers for #IoT for 2017, along with Amazon, Bosch, Cisco, Forrester and Gartner.

About four years ago, I suggested to Oxford University that we create a course focused only on the algorithmic (#Datascience and #AI) aspects of the Internet of Things. I am grateful that they accepted this obscure (and complex!) idea, creating the now industry-recognised Data Science for Internet of Things course.

Here, we work on complex and pioneering aspects of AI, Data Science and IoT (for instance systems engineering for AI/IoT).
Special thanks to Peter Holland and Adrian Stokes at Oxford University.
The list was created by Munich Re, one of the largest reinsurance and Industrial IoT companies in the world (Twitter feed: @relayr_iot).
Great to see IoT friends Alexandra, Ronald Van Loon, Boris Adryan and Rob Van Kranenberg also on the list.

Implementing Enterprise AI course using TensorFlow and Keras

Switch your career to Artificial Intelligence (AI) in 2018 through this unique, limited-edition course focused on AI for the Enterprise.


The Implementing Enterprise AI course covers

  • Design of Enterprise AI services
  • Technology foundations of Enterprise AI systems
  • Specific AI use cases
  • Development of AI services using TensorFlow and Keras
  • Deployment and Business models

We use TensorFlow and Keras. We also cover deployment models using Microservices, Docker and Kubernetes.


More details

  • The course targets developers and architects who want to transition their careers to Enterprise AI. The course correlates new AI ideas with familiar concepts like ERP and data warehousing, which makes the transition easier. The course is based on a logical concept called an ‘Enterprise AI layer’. This AI layer is focused on solving domain-specific problems for an Enterprise. We could see such a layer as an extension of the Data Warehouse or the ERP system (an Intelligent Data Warehouse / Cognitive ERP system). Thus, the approach provides tangible and practical benefits for the Enterprise, with a clear business model.
  • See link below for references from previous courses
  • Duration: starting Jan 2018, approximately six months (three months for the content and up to three months for the project)
  • Course includes a certificate of completion for projects (Projects will be created in a team and will use the code base provided in Keras)
  • The course covers Enterprise AI use cases such as insurance, fraud detection, anomaly detection, churn, classification, customer analytics etc.
  • We also offer a strategic (non-coding) version of the course
  • Delivery format is via video and online sessions (once every two weeks)
  • For pricing, please contact us below

Please contact us to sign up or to learn more: [email protected]

Testimonials for our courses


 Jean Jacques Bernand – Paris – France

“Great course with many interactions, either group or one to one that helps in the learning. In addition, tailored curriculum to the need of each student and interaction with companies involved in this field makes it even more impactful.

As for myself, it allowed me to go into topics of interests that help me in reshaping my career.”


Johnny Johnson, AT&T – USA

“This DSIOT course is a great way to get up-to-speed.  The tools and methodologies for managing devices, wrangling and fusing data, and being able to explain it are taking form fast; Ajit Jaokar is a good fit.  For me, his patience and vision keep this busy corporate family man coming back.”


Yongkang Gao, General Electric, UK.

“I especially thank Ajit for his help on my personal project of the course — recommending proper tools and introducing mentors to me, which significantly reduced my pain in the beginning stage.”


Karthik Padmanabhan, Manager – Global Data Insight and Analytics (GDIA) – Ford Motor Pvt Ltd.

“I am delighted to provide this testimonial to Ajit Jaokar who has extended outstanding support and guidance as my mentor during the entire program on Data science for IoT. Ajit is a world renowned professional in the niche area of applying the Data science principles in creating IoT apps. Talking about the program, it has a lot of breadth and depth covering some of the cutting edge topics in the industry such as Sensor Fusion, Deep Learning oriented towards the Internet of things domain. The topics such as Statistics, Machine Learning, IoT Platforms, Big Data and more speak about the complexity of the program. This is the first of its kind program in the world to provide Data Science training especially on the IoT domain and I feel fortunate to be part of the batch comprising of participants from different countries and skill sets. Overall this journey has transformed me into a mature and confident professional in this new space and I am grateful to Ajit and his team. My wish is to see this program accepted as a gold standard in the industry in the coming years”.


Peter Marriott – UK –

Attending the Data Science for IoT course has really helped me in demystifying the tools and practices behind machine learning and has allowed me to move from an awareness of machine learning to practical application.


Yair Meidan Israel –

“As a PhD student with an academic and practical experience in analytics, the DSIOT course is the perfect means by which I extend my expertise to the domain of IoT. It gradually elaborates on IoT concepts in general, and IoT analytics in particular. I recommend it to any person interested in entering that field. Thanks Ajit!”


Parinya Hiranpanthaporn, Data Architect and Advanced Analytics professional Bangkok

“Good content, Good instructor and Good networking. This course totally answers what I should know about Data Science for Internet of Things.”


Sibanjan Das – Bangalore

Ajit helped me focus and set goals for my career, which is extremely valuable. He stands by my side for every initiative I take and helps me navigate through every difficult situation I face. A true leader, a technology specialist, a good friend and a great mentor. Cheers!!!


Manuel Betancurt – Mobile developer / Electronic Engineer. – Australia

I have had the opportunity to partake in the Data Science for the IoT course taught by Ajit Jaokar. He has crafted a collection of instructional videos, code samples, projects and social interaction with him and other students of this deep knowledge.

Ajit gives an awesome introduction and description of all the tools of the trade for a data scientist getting into the IoT. Even when I really come from a software engineering background, I have found the course totally accessible and useful. The support given by Ajit to make my IoT product a data science driven reality has been invaluable. Providing direction on how to achieve my data analysis goals and even helping me to publish the results of my investigation.

The knowledge demonstrated on this course in a mathematical and computer science level has been truly exciting and encouraging. This course was the key for me to connect the little data to the big data.


Barend Botha – London and South Africa –

This is a great course for anyone wanting to move from a development background into Data Science with specific focus on IoT. The course is unique in that it allows you to learn the theory, skills and technologies required while working on solving a specific problem of your choice, one that plays to your past strengths and interests. From my experience care is taken to give participants one to one guidance in their projects, and there is also within the course the opportunity to network and share interesting content and ideas in this growing field. Highly recommended!

- Barend Botha


Jamie Weisbrod – San Diego -

Currently there is a plethora of online courses and degrees available in data science/big data. What attracted me to joining the futuretext class “Data Science for IoT” is Ajit Jaokar. My main concern in choosing a course was how to leverage skills that I already possessed as a computer engineer. Ajit took the time to discuss how I could personalize the course for my interests.

I am currently in the midst of the basic coursework, but already I have been able to network with students all over the world who are working on interesting projects. Ajit inspires a lot of people of all ages; he is also teaching young people Data Science using space exploration.


Robert Westwood – UK – Catalyst Computing
“Ajit brings to the course years of experience in the industry and a great breadth of knowledge of the companies, people and research in the Data Science/IoT arena.”

Companies / Participants who have been part of the course


Tech: GE, HPE, Oracle, TCS, Wipro, HCL, Dell, Honeywell

Banking and Fintech : Goldman Sachs, ABN Amro, Nordea, Santander, BNP Paribas

Telecoms : Nokia, AT&T, Ericsson

Consulting : McKinsey, PA consulting

Automotive : Ford, Daimler, Jaguar

Retail : Coca Cola, Target

Airlines and Aircraft : Boeing, Airbus

(Note: the above list includes participants from companies as well as companies who have sponsored their personnel)

Participant Countries

We are pleased to have participants from all over the world – leading to a vibrant and diverse learning ecosystem. A majority of our participants are from the UK, USA and India, but we also have participants from the following:

North America: USA, Canada

Europe: UK, Germany, France, Belgium, Poland, Russia, Norway, Italy, Finland, Ukraine, Austria, Ireland, Spain, Estonia, Sweden, Switzerland, Holland

Asia:  India, Japan, Thailand, Vietnam, Singapore

Middle East: UAE, Egypt, Iran

Latin America: Mexico, Brazil, Colombia, Nicaragua

Africa: South Africa, Zimbabwe

Australia and NZ: Australia



info at futuretext dot com

Timeline and Course outline

The course has three phases: Foundations, Development and Deployment. The projects (done in groups) are included in the Deployment phase.

The quiz consists mostly of coding exercises (unless you choose the strategic option).

Foundations (Jan – Feb)

  • The foundations of Data Science for the Enterprise with an emphasis on emerging fields like IoT and fintech.
  • A methodology for solving AI problems for the Enterprise
  • Foundations of Python
  • TensorFlow and Keras introduction
  • An end to end application (in code) for implementing Data Science for Enterprise and IoT
  • Understanding AI and Deep Learning

Development (March – April – May)

  • Machine learning implementation in detail
  • Understanding of Deep Learning concepts and implementation
  • Algorithms (ML and DL) : Multilayer Perceptron, Auto encoders, Deep Convolutional Networks, Recurrent Neural Networks, Reinforcement learning, Natural language processing
  • Implementations covering both Time series and Image
  • Unique considerations for Enterprise AI problems
  • Unique considerations for IoT (In this section, we consider the deployment models for IoT applications both for consumer and Industrial IoT – these include Edge, Complex event processing etc)
  • Considerations for specific industry verticals - insurance, Fraud detection, Anomaly detection, Churn, classification, Customer analytics etc

Deployment (June – July)

  • A systems thinking approach for deploying complex applications
  • Understanding Docker, Microservices, Kubernetes and their role in the deployment cycle
  • Projects (in groups, in a sprint/agile cycle)


info at futuretext dot com



Workshop: A hands-on Introduction to Deep Learning using Keras and TensorFlow

On Saturday, July 22, we have a very interesting workshop in London:
A hands-on Introduction to Deep Learning using Keras and TensorFlow,
by Ajit Jaokar and others.

Both Keras and TensorFlow are hot technologies, and they are a great way to get started with Deep Learning.

What you will learn: first, the theory and principles of Deep Learning (enough to understand Deep Learning code).
Then we will work through a tutorial in Keras and TensorFlow, i.e. a hands-on implementation of a basic neural network.
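To give a flavour of what such a basic network does under the hood, here is an illustrative sketch in plain NumPy (the workshop itself uses Keras; the layer sizes, learning rate and XOR task below are my own choices for illustration):

```python
import numpy as np

# Illustrative only: a tiny 2-8-1 network trained on XOR with
# plain gradient descent, mirroring what Keras does for us.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X):
    h = sigmoid(X @ W1 + b1)        # hidden layer
    return h, sigmoid(h @ W2 + b2)  # output layer

_, out = forward(X)
loss_before = float(((out - y) ** 2).mean())

for _ in range(5000):
    h, out = forward(X)
    # Backpropagate the squared error and take a gradient step
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= h.T @ d_out; b2 -= d_out.sum(axis=0)
    W1 -= X.T @ d_h;   b1 -= d_h.sum(axis=0)

_, out = forward(X)
loss_after = float(((out - y) ** 2).mean())
print(loss_before, "->", loss_after)  # the loss should shrink
```

In Keras, the forward pass, backpropagation and update loop above collapse into a few lines of `model.compile(...)` and `model.fit(...)` — which is exactly why it is worth seeing the mechanics once by hand.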

What you need to know: Python
What you need to bring: a laptop with Keras and TensorFlow already installed

Where? @fablab in Notting Hill

Barclay’s Eagle Lab
81 Palace Gardens Terrace
Notting Hill Gate
W8 4AT


This is free for participants of the AI course (but not for the IoT course).

Otherwise, there is a small fee.

Please email me at ajit.jaokar at if you are interested in attending

AI for Fintech course – Early discounts and limited places



AI for Fintech is a new course with limited places, focused on AI design (product, development and data) for the fintech industry.

We will first explain the end-to-end principles of AI and Deep Learning, and then describe specific applications and the implications of deploying them in the context of fintech.

The course will be conducted by Ajit Jaokar and Jakob Aungiers 




  • Foundations of Enterprise AI
  • Understanding the application of AI for fintech
  • Introduction to TensorFlow and Keras
  • End-to-end implementation for an AI application


Designing an AI product

  • Basics of designing an AI product
  • Understanding Deep Learning
  • Machine learning algorithms in TensorFlow and Keras
  • Designing with Deep Learning algorithms:
     - Multilayer Perceptron
     - Auto encoders
     - Deep Convolutional Networks
     - Recurrent Neural Networks
     - Reinforcement learning
     - Natural language processing
     - Basics of Text Analytics


Deploying AI products for fintech

  • Methodology for Enterprise AI projects
  • Deploying Enterprise AI
  • Understanding the Enterprise AI layer
  • Acquiring data and training the algorithm
  • Processing and hardware considerations
  • Business models – High Performance Computing – scaling an AI system
  • Costing an AI system
  • Creating a competitive advantage from AI
  • Specific considerations for fintech, e.g. the EU payment directive (PSD2)


Course Logistics

The course targets designers or developers who work with fintech.

Strategic Option: you can choose the strategic (no-coding) version of the course

Developer Option:  The full course based on development in TensorFlow and Keras

Duration: Starting July 31 2017 approximately six months

Offered online with video-based content

Fees:         Please contact us

Contact :  [email protected]

AI for Smart Cities Lab launch

I have written about the AI for Smart Cities lab before, and we are close to launching it.

Created by futuretext in collaboration with citysciences (UPM) and Nvidia, the AI for Smart Cities lab explores complex problems in the deployment of AI for Smart Cities.

We have been working with the Nvidia Jetson product, but are really looking forward to working with both Metropolis and Isaac for Smart Cities.

The lab will initially focus on projects and events in London (based out of fablab), Berlin and Madrid.

Most of the initial development on the Nvidia platform will use TensorFlow.

We will have an event in London in the week of June 12.

More details of collaborators / team coming soon

Any questions? Please email me at ajit.jaokar at

Re-thinking Enterprise business processes using Augmented Intelligence


In the 1990s, there was a popular book called Re-engineering the Corporation. Looking back now, Re-engineering certainly has had a mixed success – but it did have an impact over the last two decades. ERP deployments led by SAP and others were a direct result of the Business Process re-engineering phenomenon.

So, now, with the rise of AI: could we think of a new form of Re-engineering the Corporation – using Artificial Intelligence? The current group of Robotic Process Automation companies focus on the UI layer; we could extend this far deeper into the Enterprise. Leaving aside the discussion of the impact of AI on jobs, this could lead to augmented intelligence at the process level for employees (and hence an opportunity for people to transition their careers in the age of AI).

Here are some initial thoughts; I am exploring these ideas in more detail. This work is also part of an AI lab we are launching in London and Berlin in partnership with UPM and Nvidia, both for Enterprises and Cities.

Re-thinking Enterprise business processes using Augmented Intelligence

How would you rethink Enterprise business processes using Augmented Intelligence?

To put the basics into perspective: we consider a very ‘grassroots’ meaning of AI. AI is based on Deep Learning. Deep Learning involves automatic feature detection from the data. You could model a range of data types (or combinations thereof) using AI:

a) Images and sound – Convolutional neural networks

b) Transactional – e.g. loan approval

c) Sequences – including handwriting recognition via LSTMs and recurrent neural networks

d) Text processing – e.g. natural language detection

e) Behaviour understanding – via Reinforcement learning

To extend this idea to process engineering for Enterprises and Cities, we need to:

a) Understand existing business processes

b) Break the process down into its components

c) Model the process using data and algorithms (both Deep Learning and Machine Learning)

d) Improve the efficiency of the process by complementing human activity with AI (Augmented Intelligence)

But this is just the first step: you would also have to consider the wider impact of AI itself.

So, here is my list / ‘stack’:

  • New processes due to disruption at the industry level (e.g. Uber)
  • Changes of behaviour due to new processes (e.g. employees collaborating with robots as peers)
  • Improvements in current business processes for Enterprises: customer services, supply chain, finance, human resources, project management, corporate reporting, sales and logistics, management
  • The GPU-enabled enterprise (e.g. Nvidia Grid), but more broadly GPUs will democratize delivery of modern apps, enable more efficient hybridization of workflows, and unify compute and graphics
  • The availability of bodies of labelled data
  • New forms of communications: text analytics, natural language processing, speech recognition, chatbots

I am exploring these ideas in more detail as part of my work on the Enterprise AI lab we are launching in London and Berlin, in partnership with UPM and Nvidia, both for Enterprises and Cities. Welcome your comments at ajit.jaokar at or @ajitjaokar



Young Data Scientist – more about forthcoming book/kickstarter

The role of a data scientist is one of the hottest jobs in the industry today. But how do we inspire the next generation of data scientists?

I have been working on the idea of Young Data Scientist for a few years now – with various iterations and pivots.

It is now ready to launch in its next version as a book / kickstarter.

The easiest way to inspire the next generation of data scientists is to go back to the basics – i.e. the maths – because maths is the universal language that underpins progress and innovation.

This also aligns closely with my day job and my teaching – Data Science for Internet of Things at Oxford University

It allows me also to create the book /kickstarter using personal insights from my years of teaching

So, if you consider the maths foundations needed to learn Data Science, you could divide them into four key areas

Linear Algebra

Probability Theory and Statistics

Multidimensional Calculus


All of these are taught (in some shape or form) in high school (ages 13 to 17).

So, the book aims to build upon these foundations for a high-school audience, to inspire them to take up Data Science.

The challenge here is to simplify and correlate with existing maths knowledge, considering the audience (13- to 17-year-olds),

and most importantly to inspire!

It would also take them on a path to becoming Artificial Intelligence (AI) aware.

Young Data Scientist will be a book, a kickstarter, a community

It will have Open source foundations

Young Data Scientist community will also work with teachers

And finally, the Young Data Scientist community will draw upon interesting examples in Space exploration, Genomics, Ecology etc

Coding will be in Python (sometimes including numpy and tensorflow).

Please email me at ajit.jaokar at

Best of my recent articles on KDnuggets and Data Science Central

I have been regularly featured on both KDnuggets and Data Science Central.

Here is a list of my top recent articles.

I discuss these ideas in the Implementing Enterprise AI course


  • Continuous improvement for IoT through AI / Continuous learning
     - 25 Nov 2016

    In reality, especially for IoT, it is not the case that once an analytics model is built, it will give results with the same accuracy until the end of time. Data patterns change over time, which makes it absolutely important to learn from new data and improve/recalibrate the models to get correct results. The article below explains this phenomenon of continuous improvement in analytics for IoT.

  • Data Science for Internet of Things (IoT): Ten Differences From Traditional Data Science
     - 26 Sep 2016

    The connected devices (the Internet of Things) generate more than 2.5 quintillion bytes of data daily. All this data will significantly impact business processes, and Data Science for IoT will take an increasingly central role. Here we outline 10 main differences between Data Science for IoT and traditional Data Science.
  • The Evolution of IoT Edge Analytics: Strategies of Leading Players
     - 02 Sep 2016

    This article explores the significance and evolution of IoT edge analytics. Since the author believes that hardware capabilities will converge for large vendors, IoT analytics will be the key differentiator.

  • How to Become a (Type A) Data Scientist
     - 23 Aug 2016

    This post outlines the difference between a Type A and Type B data scientist, and prescribes a learning path on becoming a Type A.

Implementing Enterprise AI course – new batch – April 2017


The January batch of the Implementing AI course is completely sold out!

I am pleased to announce a new cohort of the Implementing Enterprise AI course, starting April 24, 2017. We are accepting registrations now. As usual, numbers are limited and we have an early bird discount.

Implementing Enterprise AI is a unique and limited edition course that is focussed on AI Engineering / AI for the Enterprise.

Created in partnership with , the course uses Open Source technology to work with AI use cases. Successful participants will receive a certificate of completion and also validation of their project from

 To sign up or learn more, email [email protected]

The course covers

  • Design of Enterprise AI
  • Technology foundations of Enterprise AI systems
  • Specific AI use cases
  • Development of AI services
  • Deployment and Business models

 The course targets developers and Architects who want to transition their career to Enterprise AI. The course correlates the new AI ideas with familiar concepts like ERP, Data warehousing etc and helps to make the transition easier.

The implementation / development for the course is done using the H2O APIs for R, Python & Spark. 


Starting April 2017; approximately six months (three months for the content and up to three months for the project).
The course includes a certificate of completion and also validation of the project from (projects will be created in a team)

 Course Logistics:

Offered online and offline (London and Berlin)

When:    April 2017
Duration: Approximately six months (including project)
Fees:      contact us

To sign up or learn more, email [email protected]


April – May 2017


  • Understanding the Enterprise AI layer
  • Introduction to Machine Learning
  • Unsupervised Learning
  • Supervised Learning
  • Generalized Linear Modeling
  • Gradient Boosting Machine
  • Ensembles
  • Random Forest
  • Programming foundations(see notes below)



  • Introduction to Deep Learning
  • Multilayer Perceptron
  • Auto encoders
  • Deep Convolutional Networks
  • Recurrent Neural Networks
  • Reinforcement learning
  • Programming foundations(see notes below)


July 2017


  • Natural language processing
  • Basics of Text Analytics
  • POS Tagging
  • Sentiment Analysis
  • Text Classification
  • Intelligent bots
  • Programming foundations(see notes below)


Aug – Oct 2017 – Projects and deployment


  • Deploying Enterprise AI
  • Acquiring Data and Training the Algorithm
  • Processing and hardware considerations
  • Business Models – High Performance Computing – scaling an AI system
  • Costing an AI system
  • Creating a competitive advantage from AI
  • Industry Barriers for AI


Implementation of Enterprise AI use cases (in groups)


  • Healthcare
  • Insurance
  • Adtech
  • Fraud detection
  • Anomaly detection
  • Churn, classification
  • Customer analytics
  • Natural Language Processing, Bots and Virtual Assistants



  • The course covers Design of Enterprise AI, Technology foundations of Enterprise AI systems, Specific AI use cases, Development of AI services and Deployment and Business models
  • The implementation / development for the course is done using R, Python and Spark using the H2O APIs
  • For Deep Learning, we also work with GPUs, TensorFlow, MXNet and Caffe
  • We focus on large scale problems
  • Notes on programming foundations: we assume that you have significant programming knowledge. However, we do not assume that you are familiar with Python, R or Spark.
  • The course provides you with background in these languages over the first three months. You will then use this knowledge to work on the use cases in the project phase. Certification of completion is based on completing quizzes related to the modules.
  • Project certification (validated by is based on Projects working in groups
  • Note that the syllabus is subject to change

Project certification by

Twelve types of Artificial Intelligence (AI) problems

Background – How many cats does it take to identify a Cat?


In this article, I cover the 12 types of AI problems, i.e. I address the question: in which scenarios should you use Artificial Intelligence (AI)? We cover this space in the Enterprise AI course.

Some background:

Recently, I conducted a strategy workshop for a group of senior executives running a large multinational. In the workshop, one person asked the question: how many cats does it take to identify a Cat?

This question is in reference to Andrew Ng’s famous paper on Deep Learning, in which he was able to correctly identify images of cats from YouTube videos. On one level, the answer is very clear, because Andrew Ng lists that number in his paper: 10 million images. But the answer is incomplete, because the question itself is limiting; there are a lot more details in the implementation – for example, training on a cluster of 1,000 machines (16,000 cores) for three days. I wanted to present a more detailed response to the question. Also, many problems can be solved using traditional Machine Learning algorithms – as per an excellent post from Brandon Rohrer, “Which algorithm family can answer my question?”. So, in this post I discuss problems that can be uniquely addressed through AI. This is not an exact taxonomy, but I believe it is comprehensive. I have intentionally emphasized Enterprise AI problems because I believe AI will affect many mainstream applications – although a lot of media attention goes to the more esoteric applications.


What problem does Deep Learning address?

What is Deep Learning?

Firstly, let us explore what Deep Learning is.


Deep learning refers to artificial neural networks that are composed of many layers. The ‘Deep’ refers to multiple layers. In contrast, many other machine learning algorithms like SVM are shallow because they do not have a Deep architecture through multiple layers. The Deep architecture allows subsequent computations to build upon previous ones. We currently have deep learning networks with 10+ and even 100+ layers.


The presence of multiple layers allows the network to learn more abstract features. Thus, the higher layers of the network can learn more abstract features building on the inputs from the lower layers.  A Deep Learning network can be seen as a Feature extraction layer with a Classification layer on top. The power of deep learning is not in its classification skills, but rather in its feature extraction skills. Feature extraction is automatic (without human intervention) and multi-layered.
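As an illustrative sketch (plain NumPy, with random made-up weights; the layer sizes and three-class setup are my own choices, not anything from the course), a deep network is a stack of layers whose outputs feed the next layer, with a simple classifier on top of the extracted features:

```python
import numpy as np

rng = np.random.default_rng(42)

def relu(z):
    return np.maximum(0.0, z)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# A batch of 5 inputs, each flattened to 64 raw features
x = rng.normal(size=(5, 64))

# Two hidden layers act as the feature-extraction stack:
# each layer re-represents its input in a more abstract space.
W1 = rng.normal(size=(64, 32))
W2 = rng.normal(size=(32, 16))
features1 = relu(x @ W1)          # lower-level features
features2 = relu(features1 @ W2)  # more abstract features

# The final layer is just a classifier over the learned features
W3 = rng.normal(size=(16, 3))
probs = softmax(features2 @ W3)   # probabilities over 3 classes

print(probs.shape)        # (5, 3)
print(probs.sum(axis=1))  # each row sums to 1
```

The point of the sketch is the shape of the computation: the classification head at the end is ordinary; the value lies in the stacked layers that turn raw inputs into progressively more abstract features.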


The network is trained by exposing it to a large number of labelled examples. Errors are detected and the weights of the connections between the neurons adjusted to improve results. The optimisation process is repeated to create a tuned network. Once deployed, unlabelled images can be assessed based on the tuned network.
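That train-then-assess cycle can be sketched minimally in plain NumPy (a single logistic unit on synthetic labelled data; illustrative only, and the data, learning rate and iteration count are my own assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic labelled examples: label is 1 when the feature sum is positive
X = rng.normal(size=(200, 3))
y = (X.sum(axis=1) > 0).astype(float)

w = np.zeros(3)
b = 0.0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(300):
    pred = sigmoid(X @ w + b)
    error = pred - y                    # detect errors against the labels
    w -= 0.1 * (X.T @ error) / len(y)   # adjust the weights...
    b -= 0.1 * error.mean()             # ...to reduce those errors

# Once tuned, the weights can assess previously unseen, unlabelled inputs
new_points = np.array([[2.0, 1.0, 0.5], [-2.0, -1.0, -0.5]])
print(sigmoid(new_points @ w + b) > 0.5)  # first point yes, second no
```

A real Deep Learning network repeats exactly this error-detect-and-adjust loop, just across millions of weights in many layers, with backpropagation carrying the error signal downwards.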


Feature engineering involves finding connections between variables and packaging them into a new single variable. Deep Learning performs automated feature engineering. Automated feature engineering is the defining characteristic of Deep Learning, especially for unstructured data such as images. This matters because the alternative is engineering features by hand, which is slow, cumbersome and dependent on the domain knowledge of the person or people performing the engineering.


Deep Learning suits problems where the target function is complex and the datasets are large, with examples of both positive and negative cases. Deep Learning also suits problems that involve hierarchy and abstraction.

Abstraction is a conceptual process by which general rules and concepts are derived from the usage and classification of specific examples. We can think of an abstraction as the creation of a ‘super-category’ which comprises the common features that describe the examples for a specific purpose, while ignoring the ‘local changes’ in each example. For example, the abstraction of a ‘Cat’ would comprise fur, whiskers etc. In Deep Learning, each layer is involved with the detection of one characteristic, and subsequent layers build upon previous ones. Hence, Deep Learning is used in situations where the problem domain comprises abstract and hierarchical concepts. Image recognition falls in this category. In contrast, a spam detection problem that can be modelled neatly as a spreadsheet is probably not complex enough to warrant Deep Learning.

A more detailed explanation of this question can be found in this Quora thread.

AI vs. Deep Learning vs. Machine Learning

Before we explore types of AI applications, we need to also discuss the differences between the three terms AI vs. Deep Learning vs. Machine Learning.

The term Artificial Intelligence (AI) implies a machine that can Reason. A more complete list of AI characteristics (source: David Kelnar) is:

  1. Reasoning: the ability to solve problems through logical deduction
  2. Knowledge: the ability to represent knowledge about the world (the understanding that there are certain entities, events and situations in the world; those elements have properties; and those elements can be categorised.)
  3. Planning: the ability to set and achieve goals (there is a specific future state of the world that is desirable, and sequences of actions can be undertaken that will effect progress towards it)
  4. Communication: the ability to understand written and spoken language.
  5. Perception: the ability to deduce things about the world from visual images, sounds and other sensory inputs.


The holy grail of AI is artificial general intelligence (think Terminator!), which would allow machines to function independently in a normal human environment. What we see today is mostly narrow AI (e.g. the NEST thermostat). AI is evolving rapidly, and a range of technologies currently drive it. These include: image recognition and auto labelling, facial recognition, text to speech, speech to text, auto translation, sentiment analysis, and emotion analytics in image, video, text and speech (source: Bill Vorhies). AI apps have also reached accuracies of 99%, in contrast to 95% just a few years back.


Improvements in Deep Learning algorithms drive AI.  Deep Learning algorithms can detect patterns without the prior definition of features or characteristics. They can be seen as a hybrid form of supervised learning: you must still train the network with a large number of examples, but without the requirement of predefining the characteristics (features) of those examples. Deep Learning networks have made vast improvements due both to the algorithms themselves and to better hardware (specifically GPUs).
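A minimal Keras sketch of this ‘hybrid’ supervised setup (random stand-in data, not a real dataset): we supply labelled examples, but only raw pixel values, and the network learns its own features during training.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Labelled examples are still required (supervised), but no hand-crafted
# features: the network receives raw 28x28 pixel values only.
x_train = np.random.rand(256, 28, 28)     # stand-in for e.g. MNIST images
y_train = np.random.randint(0, 10, 256)   # stand-in digit labels

model = keras.Sequential([
    keras.Input(shape=(28, 28)),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),  # features are *learned* here
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
history = model.fit(x_train, y_train, epochs=1, verbose=0)
```

Contrast this with classic machine learning, where columns such as ‘number of links’ or ‘word count’ would be engineered by hand before any model sees the data.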


Finally, in a broad sense, the term Machine Learning means the application of any algorithm against a dataset to find a pattern in the data. This includes supervised and unsupervised learning, and tasks such as segmentation, classification and regression. Despite their popularity, there are many reasons why Deep Learning algorithms will not make other Machine Learning algorithms obsolete.
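To make the contrast concrete, here is a hypothetical spam-style problem on hand-chosen tabular features (synthetic data): a classic algorithm such as logistic regression handles it well, with no Deep Learning needed.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical tabular data: four hand-chosen features per message,
# e.g. word counts and link counts, laid out like a spreadsheet.
rng = np.random.default_rng(0)
X = rng.random((500, 4))
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)  # a simple, learnable rule

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = LogisticRegression().fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)                # well above chance here
```

Because the features are already meaningful columns, a linear model captures the pattern directly; Deep Learning earns its cost only when the features themselves must be learned.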

12 types of AI problems

With this background, we now discuss the twelve types of AI problems.

1) Domain expert: Problems which involve Reasoning based on a complex body of knowledge

This includes tasks based on learning a body of knowledge (legal, financial etc.) and then formulating a process whereby the machine can simulate an expert in the field.

2) Domain extension: Problems which involve extending a complex body of Knowledge

Here, the machine learns a complex body of knowledge (such as information about existing medication) and can then suggest new insights to the domain itself – for example, new drugs to cure diseases.

3) Complex Planner: Tasks which involve Planning

Many logistics and scheduling tasks can be done by current (non-AI) algorithms, but as the optimization becomes more complex, AI could help. One example is the use of AI techniques in IoT for sparse datasets. AI techniques help in this case because we have large and complex datasets where human beings cannot detect patterns but a machine can do so easily.
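For context, here is the kind of scheduling problem that current (non-AI) algorithms already solve well: a toy worker-to-task assignment minimised with the classical Hungarian algorithm. AI becomes attractive only when the cost structure itself is too complex or sparse to write down like this.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Toy example: cost[i][j] = cost of assigning worker i to task j.
cost = np.array([[4, 1, 3],
                 [2, 0, 5],
                 [3, 2, 2]])

# Hungarian algorithm: optimal one-to-one assignment, no learning needed.
rows, cols = linear_sum_assignment(cost)
total = cost[rows, cols].sum()
print(total)  # 5  (worker 0 -> task 1, worker 1 -> task 0, worker 2 -> task 2)
```

When the ‘cost matrix’ is implicit in large, noisy sensor data rather than known upfront, learned models start to outperform such hand-specified optimisation.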

4) Better communicator: Tasks which involve improving existing communication

AI and Deep Learning benefit many modes of communication, such as automatic translation, intelligent agents etc.

5) New Perception: Tasks which involve Perception

AI and Deep Learning enable newer forms of Perception, which in turn enable new services such as autonomous vehicles.

6) Enterprise AI: AI meets Re-engineering the corporation!

While autonomous vehicles etc. get a lot of media attention, AI will be deployed in almost all sectors of the economy. In each case, the same principles apply: AI will be used to create new insights from automatic feature detection via Deep Learning, which in turn helps to optimize, improve or change a business process (over and above what can be done with traditional machine learning). I outlined some of these processes in financial services in a previous blog: Enterprise AI insights from the AI Europe event in London. In a wider sense, you could view this as Re-engineering the Corporation meets AI/Artificial Intelligence. This is very much part of the Enterprise AI course.


7) Enterprise AI: adding unstructured data and Cognitive capabilities to ERP and Data warehousing

For reasons listed above, unstructured data offers a huge opportunity for Deep Learning and hence AI.  As per Bernard Marr writing in Forbes:  “The vast majority of the data available to most organizations is unstructured – call logs, emails, transcripts, video and audio data which, while full of valuable insights, can’t easily be universally formatted into rows and columns to make quantitative analysis straightforward. With advances in fields such as image recognition, sentiment analysis and natural language processing, this information is starting to give up its secrets, and mining it will become increasingly big business in 2017.” I very much agree with this. In practice, this will mean enhancing the features of ERP and Data warehousing systems through Cognitive systems.
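A minimal sketch of mining such unstructured text with Keras: a toy sentiment classifier over a handful of invented free-text snippets. The texts and labels are hypothetical; in an ERP setting, these would be call logs or support emails.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Toy unstructured data: free-text snippets with sentiment labels (1 = positive).
texts = np.array(["great service and fast reply",
                  "terrible support very slow",
                  "loved the product",
                  "awful experience would not recommend"])
labels = np.array([1, 0, 1, 0])

# Learn a vocabulary from the raw strings; no rows-and-columns schema needed.
vectorize = layers.TextVectorization(max_tokens=100, output_mode="multi_hot")
vectorize.adapt(texts)
X = vectorize(texts).numpy()  # multi-hot bag-of-words features

model = keras.Sequential([
    keras.Input(shape=(X.shape[1],)),
    layers.Dense(8, activation="relu"),
    layers.Dense(1, activation="sigmoid"),  # P(positive sentiment)
])
model.compile(optimizer="adam", loss="binary_crossentropy")
model.fit(X, labels, epochs=2, verbose=0)
preds = model.predict(X, verbose=0)
```

The point is the pipeline shape: raw text goes in, a learned signal comes out, and that signal can then sit alongside the structured columns of a Data warehouse.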

8) Problems which impact domains due to second order consequences of AI

David Kelnar says in The fourth industrial revolution a primer on artificial intelligenc…:

“The second-order consequences of machine learning will exceed its immediate impact. Deep learning has improved computer vision, for example, to the point that autonomous vehicles (cars and trucks) are viable. But what will be their impact? Today, 90% of people and 80% of freight are transported via road in the UK. Autonomous vehicles alone will impact: safety (90% of accidents are caused by driver inattention); employment (2.2 million people work in the UK haulage and logistics industry, receiving an estimated £57B in annual salaries); insurance (Autonomous Research anticipates a 63% fall in UK car insurance premiums over time); sector economics (consumers are likely to use on-demand transportation services in place of car ownership); vehicle throughput; urban planning; regulation and more.”


9) Problems in the near future that could benefit from improved algorithms

A catch-all category: things which were not possible in the past could become possible in the near future due to better algorithms or better hardware.  For example, improvements continue to be made in speech recognition, and the abilities of the machine now equal those of a human. From 2012, Google used LSTMs to power the speech recognition system in Android. Just six weeks ago, Microsoft engineers reported that their system reached a word error rate of 5.9% – a figure roughly equal to that of human abilities for the first time in history.  The goal-post continues to move rapidly: the next goal, for example, is building an avatar that can capture your personality.

10) Evolution of Expert systems

Expert systems have been around for a long time.  Much of the vision of Expert systems could be implemented in AI/Deep Learning algorithms in the near future. If you study the architecture of IBM Watson, you can see that the Watson strategy leads to an Expert system vision. Of course, the same ideas can be implemented independently of Watson today.


11) Super Long sequence pattern recognition

This domain is of personal interest to me due to my background with IoT (see my course at Oxford University, Data Science for Internet of Things). I got this title from a slide by Uber’s head of Deep Learning, whom I met at the AI Europe event in London. The application of AI techniques to sequential pattern recognition is still an early-stage domain (and does not yet get the kind of attention that, for example, CNNs do), but in my view this will be a rapidly expanding space. For some background, see this thesis from Technische Universität München (TUM), Deep Learning For Sequential P…, and also this blog by Jakob Aungiers, LSTM Neural Network for Time Series Prediction.
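A minimal sketch of the LSTM time-series approach referenced above (a toy sine wave, not Uber's or Aungiers' actual setup): the network predicts the next value of a sequence from a sliding window of previous steps, exactly the shape of many IoT sensor problems.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Toy sequential data: predict the next point of a sine wave
# from the previous 20 time steps.
series = np.sin(np.linspace(0, 20 * np.pi, 1000))
window = 20
X = np.array([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]
X = X[..., np.newaxis]            # shape: (samples, timesteps, features)

model = keras.Sequential([
    keras.Input(shape=(window, 1)),
    layers.LSTM(16),              # learns the sequential pattern
    layers.Dense(1),              # next-step prediction
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, verbose=0)
pred = model.predict(X[:1], verbose=0)
```

For genuinely long IoT sequences, the same windowing idea applies, with the LSTM (or its successors) carrying state across far more time steps.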


12) Extending Sentiment Analysis using AI

The interplay between AI and Sentiment analysis is also a new area. There are already many synergies between the two because many AI apps need sentiment analysis features.

“The common interest areas where Artificial Intelligence (AI) meets sentiment analysis can be viewed from four aspects of the problem and the aspects can be grouped as Object identification, Feature extraction, Orientation classification and Integration. The existing reported solutions or available systems are still far from being perfect or fail to meet the satisfaction level of the end users. The main issue may be that there are many conceptual rules that govern sentiment and there are even more clues (possibly unlimited) that can convey these concepts from realization to verbalization of a human being.” source: SAAIP

Notes: the post The fourth industrial revolution a primer on artificial intelligenc… also offers good insight on AI domains; also see #AI application areas – a paper review of AI applications (pdf).



To conclude, AI is a rapidly evolving space. Although AI is more than Deep Learning, advances in Deep Learning drive AI, and automatic feature learning is its key capability. AI needs many detailed and pragmatic strategies which I have not covered here; a good AI Designer should be able to suggest more complex strategies such as pre-training or Transfer Learning.
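The mechanics of Transfer Learning can be sketched in a few lines of Keras: freeze a pre-trained base and train only a new task-specific head. (Here `weights=None` keeps the example self-contained and avoids a download; in real use you would pass `weights="imagenet"` to get pre-trained features.)

```python
from tensorflow import keras
from tensorflow.keras import layers

# Transfer-learning mechanics: reuse a base network, freeze its layers,
# and train only a small new head on the target task.
base = keras.applications.MobileNetV2(input_shape=(96, 96, 3),
                                      include_top=False, weights=None)
base.trainable = False                     # freeze the base's features

model = keras.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(2, activation="softmax"),  # new task-specific head
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
# Only the Dense head's kernel and bias remain trainable.
```

This is why Transfer Learning changes the economics: most of the expensive feature learning is done once, and each new problem trains only the small head.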

AI is not a panacea. It comes with a cost (skills, development, and architecture) but provides an exponential increase in performance. Hence, AI is ultimately a rich company’s game. But AI is also a ‘winner takes all’ game and hence provides a competitive advantage. The winners in AI will take an exponential view, addressing very large-scale problems: what is possible with AI that is not possible now?

We cover this space in the  Enterprise AI course