agilePHM – a new open source product for rapid prototyping of PHM analytics


We are launching a new product called agilePHM.

I have been working with PHM (Prognostics and Health Management) in Industrial IoT for a while, and it is a well-known discipline.

Prognostics and Health Management (PHM) is an engineering discipline focused on predicting the time at which a system or a component will no longer perform its intended function. This lack of performance is most often a failure beyond which the system can no longer be used to meet desired performance. The predicted time then becomes the remaining useful life (RUL), which is an important concept in decision making for contingency mitigation. Prognostics predicts the future performance of a component by assessing the extent of deviation or degradation of a system from its expected normal operating conditions. The science of prognostics is based on the analysis of failure modes, detection of early signs of wear and aging, and fault conditions. An effective prognostics solution is implemented when there is sound knowledge of the failure mechanisms that are likely to cause the degradations leading to eventual failures in the system. (source: Wikipedia)
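To make the RUL concept concrete, here is a minimal, hypothetical sketch in Python: fit a linear trend to a simulated degradation signal and extrapolate to a failure threshold. Real PHM models are considerably more sophisticated; the numbers below are invented purely for illustration.

```python
import numpy as np

# Hypothetical degradation data: operating hours observed so far and a
# simulated wear measurement that grows roughly linearly with use.
hours = np.arange(0, 100, 10, dtype=float)
wear = 0.02 * hours + 0.1
failure_threshold = 5.0  # wear level at which the component fails

# Fit wear ~ slope * hours + intercept
slope, intercept = np.polyfit(hours, wear, 1)

# Time at which the fitted trend crosses the failure threshold
t_fail = (failure_threshold - intercept) / slope

# Remaining useful life = predicted failure time minus hours used so far
rul = t_fail - hours[-1]
print(f"Estimated RUL: {rul:.0f} hours")  # Estimated RUL: 155 hours
```

This is the simplest possible prognostic: a single linear extrapolation with no uncertainty estimate. Practical PHM would add noise modelling, confidence bounds on the RUL, and knowledge of the underlying failure mechanisms.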


PHM applies to a range of domains such as defence, shipping, and industrial applications.

We are developing a product for rapid prototyping of the analytics component of PHM.

agilePHM is designed for rapid prototyping of PHM applications, implemented on standard hardware.

The need for the product

The idea of agilePHM arose from a few observations:

a)  There is a need for rapid prototyping of new ideas (from an analytics standpoint) as an exclusive function: Industrial IoT is a new and evolving space. Ideas from different domains cross-pollinate, and there is a need to quickly test out concepts (either products or processes)

b)  Data Science skills shortage: Data science skills are expensive and are often focused on industries like banking (in contrast to Industrial IoT). So, think of agilePHM as a ‘Data Scientist in a box’ for the Industrial IoT space

c)  Larger products have a much heavier footprint: Our customer is someone who wants to rapidly prototype the model (without knowing the algorithms in detail). Larger products have a much heavier footprint; many seem like installing ERP in the old days! They treat rapid prototyping as a small component (as opposed to placing exclusive emphasis on it)

d)  Flexibility: The approach complements existing approaches like Physics-based modelling

e)  Why open source: Our main strength lies in IoT analytics (Ajit Jaokar teaches a course on Data Science for Internet of Things at the University of Oxford). However, the problem we address is complex because there are many processes (and many machines!) to abstract algorithmically. This needs some form of open source.


agilePHM has three components

1) Digital Twin

2) Rapid prototyping

3) Workflow – Process engineering

agilePHM will have the following deployment models:

  • On premise with support

  • Open source

  • Kaggle-like contest / community engagement

It also allows students in our course to gain real-life, practical experience


a)   If you are a company interested in working with us, please email me on ajit.jaokar at

b) If you are interested in gaining real experience in AI .. you can work on the product with companies as part of our course. Please contact info at to know more

Pleased to be in the list of top 30 influencers for #IoT for 2017 along with Amazon, Bosch, Cisco, Forrester and Gartner.


About 4 years ago, I suggested to Oxford University that we should create a course on only the Algorithmic (#Datascience and #AI) aspects of the Internet of Things. I am grateful that they accepted the obscure (and complex!) idea, creating the now industry-recognised Data Science for Internet of Things course

Here, we work on complex and pioneering aspects of AI, Data Science and IoT (for instance systems engineering for AI/IoT).
Special thanks to Peter Holland and Adrian Stokes at Oxford University.
The list was created by Munich Re, one of the largest reinsurance and Industrial IoT companies in the world (twitter feed: @relayr_iot)
Great to see IoT friends Alexandra, Ronald Van Loon, Boris Adryan and Rob Van Kranenberg also on the list

Implementing Enterprise AI course using TensorFlow and Keras

Switch your career to Artificial Intelligence (AI) in 2018 through this unique and limited-edition course focused on AI for the Enterprise.


The Implementing Enterprise AI course covers

  • Design of Enterprise AI services
  • Technology foundations of Enterprise AI systems
  • Specific AI use cases
  • Development of AI services using TensorFlow and Keras
  • Deployment and Business models

We use TensorFlow and Keras. We also cover deployment models using Microservices, Docker and Kubernetes.


More details

  • The course targets developers and Architects who want to transition their careers to Enterprise AI. The course correlates the new AI ideas with familiar concepts like ERP, Data warehousing etc., and helps to make the transition easier. The course is based on a logical concept called an ‘Enterprise AI layer’. This AI layer is focused on solving domain-specific problems for an Enterprise. We could see such a layer as an extension to the Data Warehouse or the ERP system (an Intelligent Data Warehouse / Cognitive ERP system). Thus, the approach provides tangible and practical benefits for the Enterprise with a clear business model.
  • See link below for references from previous courses
  • Duration: Starting Jan 2018; approximately six months (three months for the content and up to three months for the Project)
  • Course includes a certificate of completion for projects (Projects will be created in a team and will use the code base provided in Keras)
  • The course covers the Enterprise AI Use Cases like insurance, Fraud detection, Anomaly detection, Churn, classification, Customer analytics etc
  • We also offer a strategic (non-coding) version of the course
  • Delivery format is via video and online sessions (once every two weeks)
  • For pricing please contact us below

Please contact us to sign up or to know more [email protected]

Testimonials for our courses


 Jean Jacques Bernand – Paris – France

“Great course with many interactions, either group or one to one that helps in the learning. In addition, tailored curriculum to the need of each student and interaction with companies involved in this field makes it even more impactful.

As for myself, it allowed me to go into topics of interests that help me in reshaping my career.”


Johnny Johnson, AT&T – USA

“This DSIOT course is a great way to get up-to-speed.  The tools and methodologies for managing devices, wrangling and fusing data, and being able to explain it are taking form fast; Ajit Jaokar is a good fit.  For me, his patience and vision keep this busy corporate family man coming back.”


Yongkang Gao, General Electric, UK.

“I especially thank Ajit for his help on my personal project of the course — recommending proper tools and introducing mentors to me, which significantly reduced my pain in the beginning stage.”


Karthik Padmanabhan, Manager – Global Data Insight and Analytics (GDIA) – Ford Motor Pvt Ltd.

“I am delighted to provide this testimonial to Ajit Jaokar who has extended outstanding support and guidance as my mentor during the entire program on Data science for IoT. Ajit is a world renowned professional in the niche area of applying the Data science principles in creating IoT apps. Talking about the program, it has a lot of breadth and depth covering some of the cutting edge topics in the industry such as Sensor Fusion, Deep Learning oriented towards the Internet of things domain. The topics such as Statistics, Machine Learning, IoT Platforms, Big Data and more speak about the complexity of the program. This is the first of its kind program in the world to provide Data Science training especially on the IoT domain and I feel fortunate to be part of the batch comprising of participants from different countries and skill sets. Overall this journey has transformed me into a mature and confident professional in this new space and I am grateful to Ajit and his team. My wish is to see this program accepted as a gold standard in the industry in the coming years”.


Peter Marriott – UK –

Attending the Data Science for IoT course has really helped me in demystifying the tools and practices behind machine learning and has allowed me to move from an awareness of machine learning to practical application.


Yair Meidan Israel –

“As a PhD student with an academic and practical experience in analytics, the DSIOT course is the perfect means by which I extend my expertise to the domain of IoT. It gradually elaborates on IoT concepts in general, and IoT analytics in particular. I recommend it to any person interested in entering that field. Thanks Ajit!”


Parinya Hiranpanthaporn, Data Architect and Advanced Analytics professional Bangkok

“Good content, Good instructor and Good networking. This course totally answers what I should know about Data Science for Internet of Things.”


Sibanjan Das – Bangalore

Ajit helped me to focus and set goals for my career that is extremely valuable. He stands by my side for every initiative I take and helps me to navigate me through every difficult situation I face. A true leader, a technology specialist, good friend and a great mentor. Cheers!!!


Manuel Betancurt – Mobile developer / Electronic Engineer. – Australia

I have had the opportunity to partake in the Data Science for the IoT course taught by Ajit Jaokar. He have crafted a collection of instructional videos, code samples, projects and social interaction with him and other students of this deep knowledge.

Ajit gives an awesome introduction and description of all the tools of the trade for a data scientist getting into the IoT. Even when I really come from a software engineering background, I have found the course totally accessible and useful. The support given by Ajit to make my IoT product a data science driven reality has been invaluable. Providing direction on how to achieve my data analysis goals and even helping me to publish the results of my investigation.

The knowledge demonstrated on this course in a mathematical and computer science level has been truly exciting and encouraging. This course was the key for me to connect the little data to the big data.


Barend Botha – London and South Africa –

This is a great course for anyone wanting to move from a development background into Data Science with specific focus on IoT. The course is unique in that it allows you to learn the theory, skills and technologies required while working on solving a specific problem of your choice, one that plays to your past strengths and interests. From my experience care is taken to give participants one to one guidance in their projects, and there is also within the course the opportunity to network and share interesting content and ideas in this growing field. Highly recommended!

- Barend Botha


Jamie Weisbrod – San Diego -

Currently there is a plethora of online courses and degrees available in data science/big data. What attracted me to joining the futuretext class “Data Science for ioT” is Ajit Jaokar. My main concern in choosing a course was how to leverage skills that I already possessed as a computer engineer. Ajit took the time to discuss how I could personalize the course for my interests.

I am currently in the midst of the basic coursework but already I have been able to network with students all over the world who are working on interesting projects. Ajit inspires a lot of people at all ages as he is also teaching young people Data science using space exploration.


 Robert Westwood – UK – Catalyst computing
“Ajit brings to the course years of experience in the industry and a great breadth of knowledge of the companies, people and research in the Data Science/IoT arena.”

Companies / Participants who have been part of the course


Tech: GE, HPE, Oracle, TCS, Wipro, HCL, Dell, Honeywell

Banking and Fintech : Goldman Sachs, ABN Amro, Nordea, Santander, BNP Paribas

Telecoms : Nokia, AT&T, Ericsson

Consulting : McKinsey, PA consulting

Automotive : Ford, Daimler, Jaguar

Retail : Coca Cola, Target

Airlines and Aircrafts : Boeing, Airbus

(Note : Above list includes participants from companies and also companies who have sponsored their personnel)

Participant Countries

We are pleased to have participants from all over the world – leading to a vibrant and diverse learning ecosystem. A majority of our participants are from the UK, USA and India. But we also have participants from the following regions:

North America: USA, Canada

Europe: UK, Germany, France, Belgium, Poland, Russia, Norway, Italy, Finland, Ukraine, Austria, Ireland, Spain, Estonia, Sweden, Switzerland, Holland

Asia:  India, Japan, Thailand, Vietnam, Singapore

Middle East: UAE, Egypt, Iran

Latin America: Mexico, Brazil, Colombia, Nicaragua

Africa: South Africa, Zimbabwe

Australia and NZ: Australia



info at futuretext dot com

Timeline and Course outline

The course has three phases: Foundations, Development and Deployment. The projects are included in the Deployment phase (in groups)

The quiz mostly consists of coding exercises (unless you choose the strategic option)

Foundations (Jan – Feb)

  • The foundations of Data Science for the Enterprise with an emphasis on emerging fields like IoT and fintech.
  • A methodology for solving AI problems for the Enterprise
  • Foundations of Python
  • TensorFlow and Keras introduction
  • An end to end application (in code) for implementing Data Science for Enterprise and IoT
  • Understanding AI and Deep Learning

Development (March – April – May)

  • Machine learning implementation in detail
  • Understanding of Deep Learning concepts and implementation
  • Algorithms (ML and DL) : Multilayer Perceptron, Auto encoders, Deep Convolutional Networks, Recurrent Neural Networks, Reinforcement learning, Natural language processing
  • Implementations covering both Time series and Image
  • Unique considerations for Enterprise AI problems
  • Unique considerations for IoT (In this section, we consider the deployment models for IoT applications both for consumer and Industrial IoT – these include Edge, Complex event processing etc)
  • Considerations for specific industry verticals - insurance, Fraud detection, Anomaly detection, Churn, classification, Customer analytics etc

Deployment (June – July)

  •  A systems thinking approach for deploying complex applications
  • Understanding Docker, Microservices, Kubernetes and their role in the deployment cycle
  • Projects(in groups in a sprint / agile cycle)


info at futuretext dot com



Workshop: A hands-on Introduction to Deep Learning using Keras and TensorFlow

On Saturday July 22, in London, we have a very interesting workshop:
A hands-on Introduction to Deep Learning using Keras and TensorFlow
by Ajit Jaokar and others

Both Keras and TensorFlow are hot technologies, and they are a great way to get you started with Deep Learning

What you will learn: First, you learn the theory and principles of Deep Learning (enough to understand Deep Learning code).
Then we will look at a tutorial in Keras and TensorFlow, i.e. a hands-on implementation to build a basic neural network

What you need to know: Python
What you need to bring: A laptop with Keras and TensorFlow already installed

Where? @fablab in Notting Hill

Barclay’s Eagle Lab
81 Palace Gardens Terrace
Notting Hill Gate
W8 4AT


This is free for participants of the AI course (but not for the IoT course)

Otherwise there is a small fee

Please email me at ajit.jaokar at if you are interested in attending
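For a flavour of what the workshop tutorial builds, here is a minimal sketch of a basic neural network. The workshop itself uses Keras and TensorFlow; this NumPy-only version is an assumption-free stand-in that shows the underlying mechanics (forward pass, backpropagation, gradient descent) on the classic XOR problem.

```python
import numpy as np

# XOR: the smallest problem a single-layer network cannot solve,
# hence the classic first exercise for a hidden-layer network.
rng = np.random.default_rng(42)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer: 2 inputs -> 4 hidden units -> 1 output
W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros((1, 4))
W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X):
    h = sigmoid(X @ W1 + b1)
    return h, sigmoid(h @ W2 + b2)

_, out = forward(X)
initial_loss = float(np.mean((out - y) ** 2))

lr = 1.0
for _ in range(5000):
    h, out = forward(X)
    # Backpropagate the mean squared error through both layers
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient descent updates
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

_, out = forward(X)
final_loss = float(np.mean((out - y) ** 2))
print(initial_loss, final_loss)  # the loss should drop substantially
```

In the Keras version of the same model, the layer definitions, loss and training loop collapse to a few lines of the Sequential API, which is why the workshop teaches just enough theory to read code like the above first.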

AI for Fintech course – Early discounts and limited places



AI for Fintech is a new course with limited places, focused on AI design (product, development and data) for the fintech industry.

We will first explain the end-to-end principles of AI and Deep Learning, and then describe specific applications and the implications of deploying them in the context of fintech.

The course will be conducted by Ajit Jaokar and Jakob Aungiers 




  • Foundations of Enterprise AI
  • Understanding the application of AI for fintech
  • Introduction to TensorFlow and Keras
  • End to end implementation for an AI application


Designing an AI product

  • Basics of Designing an AI product
  • Understanding Deep Learning
  • Machine learning algorithms in TensorFlow and Keras
  • Designing with Deep Learning algorithms: Multilayer Perceptron, Auto encoders, Deep Convolutional Networks, Recurrent Neural Networks, Reinforcement learning, Natural language processing, Basics of Text Analytics


Deploying AI products for fintech

  • Methodology for Enterprise AI projects
  • Deploying Enterprise AI
  • Understanding the Enterprise AI layer
  • Acquiring Data and Training the Algorithm
  • Processing and hardware considerations
  • Business Models – High Performance Computing – Scaling an AI system
  • Costing an AI system
  • Creating a competitive advantage from AI
  • Specific considerations for fintech: ex EU payment directive (PSD2) etc


Course Logistics

The course targets designers or developers who work in fintech.

Strategic Option:  You can choose to work with the strategic option  (no coding)

Developer Option:  The full course based on development in TensorFlow and Keras

Duration: Starting July 31, 2017; approximately six months

Offered online with video-based content

Fees:         Please contact us

Contact :  [email protected]

AI for Smart Cities Lab launch

I have written about the AI for Smart cities lab before and we are close to launching it

Created by futuretext in collaboration with citysciences (UPM) and Nvidia, the AI for Smart Cities lab explores complex problems in the deployment of AI for Smart Cities

We have been working with the Nvidia Jetson product but are really looking forward to working with both Metropolis and Isaac for Smart cities

The lab will initially focus on projects and events in London (based out of fablab), Berlin and Madrid.

Most of the initial development on the Nvidia platform will use TensorFlow

We will have an event in London on the week of June 12

More details of collaborators / team coming soon

Any questions – please email me on ajit.jaokar at

Re-thinking Enterprise business processes using Augmented Intelligence


In the 1990s, there was a popular book called Re-engineering the Corporation. Looking back now, Re-engineering certainly has had a mixed success – but it did have an impact over the last two decades. ERP deployments led by SAP and others were a direct result of the Business Process re-engineering phenomenon.

So, now, with the rise of AI: Could we think of a new form of Re-engineering the Corporation – using Artificial Intelligence? The current group of Robotic process automation companies focus on the UI layer. We could extend this far deeper into the Enterprise. Leaving aside the discussion of  the impact of AI on jobs, this could lead to augmented intelligence at the process level for employees (and hence an opportunity for people to transition their careers in the age of AI).

Here are some initial thoughts. I am exploring these ideas in more detail. This work is also a part of an AI lab we are launching in London and Berlin in partnership with UPM and Nvidia both for Enterprises and Cities

Re-thinking Enterprise business processes using Augmented Intelligence

How would you rethink Enterprise business processes using Augmented Intelligence?

To put the basics into perspective: we consider a very ‘grassroots’ meaning of AI, based on Deep Learning. Deep Learning involves automatic feature detection from the data. You could model a range of data types (or combinations thereof) using AI:

a)      Images and sound – Convolutional neural networks

b)      Transactional – ex Loan approval

c)       Sequences: including handwriting recognition via LSTMs and recurrent neural networks

d)      Text processing – ex natural language detection

e)      Behaviour understanding – via Reinforcement learning

To extend this idea to Process engineering for Enterprises and Cities, we need to

a)      Understand existing business processes

b)      Break the process down into its components

c)       Model the process using Data and Algorithms (both Deep Learning and Machine Learning)

d)      Improve the efficiency of the process by complementing the human activity with AI(Augmented intelligence)

But this is just the first step: you would also have to consider the wider impact of AI itself

So, here is my list / ‘stack’:

  • New processes due to disruption at the industry level (ex Uber)
  • Change of behaviour due to new processes( ex: employees collaborating with Robots as peers)
  • Improvements in current Business Processes for Enterprises: Customer services, Supply chain, Finance, Human resources, Project management, Corporate reporting, Sales and Logistics, Management
  • The GPU-enabled enterprise (ex Nvidia Grid) – but more broadly, GPUs will democratize delivery of modern apps, enable more efficient hybridization of workflows, and unify compute and graphics
  • The availability of bodies of labelled data
  • New forms of Communications: Text analytics, Natural language processing, Speech recognition, chatbots

I am exploring these ideas in more detail as part of my work on the Enterprise AI lab we are launching in London and Berlin in partnership with UPM and Nvidia, both for Enterprises and Cities. Welcome your comments at ajit.jaokar at or @ajitjaokar



Young Data Scientist – more about forthcoming book/kickstarter

The role of a data scientist is one of the hottest jobs in the industry today – But how do we inspire the next generation of data scientists?

I have been working on the idea of Young Data Scientist for a few years now – with various iterations and pivots

I am now ready to launch the next version as a book / kickstarter

The easiest way to inspire the next generation of Data Scientists is to go back to the basics – i.e. the Maths because Maths is the Universal language that underpins progress and innovation

This also aligns closely with my day job and my teaching – Data Science for Internet of Things at Oxford University

It allows me also to create the book /kickstarter using personal insights from my years of teaching

So, if you consider the maths foundations needed to learn Data Science, you could divide them into a few key areas

Linear Algebra

Probability Theory and Statistics

Multidimensional Calculus


All of these are taught (in some shape or form) in high school (ages 13 to 17)

So, the book aims to build upon these foundations for a high school audience to inspire them to take up Data Science

The challenge here is to simplify and correlate with existing maths knowledge, considering the audience (13 to 17 year olds)

and most importantly to inspire!

It would also take them on a path to being Artificial Intelligence (AI) aware

Young Data Scientist will be a book, a kickstarter, a community

It will have Open source foundations

Young Data Scientist community will also work with teachers

And finally, the Young Data Scientist community will draw upon interesting examples in Space exploration, Genomics, Ecology etc

Coding will be in Python (including NumPy and sometimes TensorFlow)
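As a flavour of how the maths areas above come together in Python, here is a small worked example (hypothetical, in the spirit of the book): fitting a straight line by least squares touches linear algebra (solving the normal equations), probability and statistics (least squares is the maximum-likelihood fit under Gaussian noise) and calculus (the normal equations come from setting the gradient of the squared error to zero).

```python
import numpy as np

# Five points lying exactly on the line y = 2x + 1
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 2.0 * x + 1.0

# Linear algebra: build the design matrix [x, 1] and solve the
# normal equations (X^T X) w = X^T y for the weights w = [slope, intercept]
X = np.column_stack([x, np.ones_like(x)])
w = np.linalg.solve(X.T @ X, X.T @ y)

print(w)  # recovers the slope 2 and intercept 1
```

The same few lines can then be redone with noisy data, which opens the door to the statistics (how confident are we in the slope?) and the calculus (why do the normal equations minimise the error?) for a high-school audience.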

Please email me at ajit.jaokar at

Best of my recent articles on KDnuggets and Data Science Central

I have been regularly featured on both KDnuggets and Data Science Central.

Here is a list of my top articles recently

I discuss these ideas in the Implementing Enterprise AI course


  • Continuous improvement for IoT through AI / Continuous learning
     - 25 Nov 2016

    In reality, especially for IoT, it is not the case that once an analytics model is built, it will give results with the same accuracy till the end of time. Data patterns change over time, which makes it important to learn from new data and improve / recalibrate the models to get correct results. The article explains this phenomenon of continuous improvement in analytics for IoT.

  • Data Science for Internet of Things (IoT): Ten Differences From Traditional Data Science
     - 26 Sep 2016

    The connected devices (the Internet of Things) generate more than 2.5 quintillion bytes of data daily. All this data will significantly impact business processes, and Data Science for IoT will take an increasingly central role. Here we outline 10 main differences between Data Science for IoT and traditional Data Science.
  • The Evolution of IoT Edge Analytics: Strategies of Leading Players
     - 02 Sep 2016

    This article explores the significance and evolution of IoT edge analytics. Since the author believes that hardware capabilities will converge for large vendors, IoT analytics will be the key differentiator.

  • How to Become a (Type A) Data Scientist
     - 23 Aug 2016

    This post outlines the difference between a Type A and Type B data scientist, and prescribes a learning path on becoming a Type A.
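The continuous learning / recalibration idea in the first article above can be sketched as a simple drift check. This is a hypothetical illustration, not the article's own method: compare each new batch of IoT data against the distribution the model was trained on, and flag the model for recalibration when they diverge.

```python
import numpy as np

# Simulated "training era" sensor data the model was originally fitted on
rng = np.random.default_rng(0)
train_data = rng.normal(loc=0.0, scale=1.0, size=1000)
train_mean, train_std = train_data.mean(), train_data.std()

def needs_recalibration(new_batch, threshold=4.0):
    """Flag drift when the new batch mean lies more than `threshold`
    standard errors away from the training mean (a simple z-test)."""
    se = train_std / np.sqrt(len(new_batch))
    z = abs(new_batch.mean() - train_mean) / se
    return bool(z > threshold)

stable_batch = rng.normal(0.0, 1.0, size=200)   # same pattern as training
drifted_batch = rng.normal(1.5, 1.0, size=200)  # the data pattern has changed

print(needs_recalibration(stable_batch))   # no drift expected
print(needs_recalibration(drifted_batch))  # drift flagged, retrain the model
```

In a production IoT pipeline this check would run on a schedule, and a positive result would trigger retraining on a window of recent data rather than just raising a flag.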

Implementing Enterprise AI course – new batch – April 2017


The January batch of the Implementing Enterprise AI course is completely sold out!

I am pleased to announce a new cohort for the Implementing Enterprise AI course, starting April 24, 2017. We are now accepting applications. As usual, numbers are limited and we have an early bird discount

Implementing Enterprise AI is a unique and limited edition course that is focussed on AI Engineering / AI for the Enterprise.

Created in partnership with , the course uses Open Source technology to work with AI use cases. Successful participants will receive a certificate of completion and also validation of their project from

 To sign up or learn more, email [email protected]

The course covers

  • Design of Enterprise AI
  • Technology foundations of Enterprise AI systems
  • Specific AI use cases
  • Development of AI services
  • Deployment and Business models

 The course targets developers and Architects who want to transition their career to Enterprise AI. The course correlates the new AI ideas with familiar concepts like ERP, Data warehousing etc and helps to make the transition easier.

The implementation / development for the course is done using the H2O APIs for R, Python & Spark. 


Starting April 2017 approximately six months (3 months for the content and up to three months for the Project)
Course includes a certificate of completion and also validation of the project from (Projects will be created in a team)

 Course Logistics:

Offered online and offline (London and Berlin)

When:    April 2017
Duration: Approximately six months (including project)
Fees:      contact us

To sign up or learn more, email [email protected]


April – May 2017


  • Understanding the Enterprise AI layer
  • Introduction to Machine Learning
  • Unsupervised Learning
  • Supervised Learning
  • Generalized Linear Modeling
  • Gradient Boosting Machine
  • Ensembles
  • Random Forest
  • Programming foundations(see notes below)



June 2017


  • Introduction to Deep Learning
  • Multilayer Perceptron
  • Auto encoders
  • Deep Convolutional Networks
  • Recurrent Neural Networks
  • Reinforcement learning
  • Programming foundations(see notes below)


July 2017


  • Natural language processing
  • Basics of Text Analytics
  • POS Tagging
  • Sentiment Analysis
  • Text Classification
  • Intelligent bots
  • Programming foundations(see notes below)


Aug – Oct 2017 – Projects and deployment


  • Deploying Enterprise AI
  • Acquiring Data and Training the Algorithm
  • Processing and hardware considerations
  • Business Models – High Performance Computing – Scaling an AI system
  • Costing an AI system
  • Creating a competitive advantage from AI
  • Industry Barriers for AI


Implementation of Enterprise AI use cases (in groups)


  • Healthcare
  • Insurance
  • Adtech
  • Fraud detection
  • Anomaly detection
  • Churn, classification
  • Customer analytics
  • Natural Language Processing, Bots and Virtual Assistants



  • The course covers Design of Enterprise AI, Technology foundations of Enterprise AI systems, Specific AI use cases, Development of AI services and Deployment and Business models
  • The implementation / development for the course is done using R, Python and Spark using the H2O APIs
  • For Deep Learning, we also work with GPUs, TensorFlow, MXNet and Caffe
  • We focus on large scale problems
  • Notes on Programming foundations: We assume that you have significant Programming knowledge. However, we do not assume that you are familiar with Python, R or Spark.
  • The course provides you with background in these languages over the first three months. You will then use this knowledge to work on the use cases in the Project phase. Certification of completion is based on completing quizzes related to the modules.
  • Project certification (validated by ) is based on Projects, working in groups
  • Note that the syllabus is subject to change

Project certification by