How to read a book a week in the age of Facebook and Twitter



When my son was nearing 14, I mentioned to him that I had read James Clavell’s epic novel Shogun (1,136 pages) at around 15, i.e. close to his age. Shogun is admittedly not typical reading for a 15-year-old, but I have always been an avid reader, even as a child, of just about everything.

But when I saw Shogun again, I realised to my horror that I could never read and complete a 1,136-page book now!

And that is worrying.

With the sheer amount of distraction today, it’s hard to read a book of 1,000+ pages.

Yet I, and so many others, had done it a couple of decades ago (the book was a bestseller).

So, I attempted to read a book a week for a year in 2017

This experiment continues in 2018

This post is about how you could read a book a week

It follows some unorthodox strategies for reading in the face of social media, which I hope you will also find useful.


Why read a book a week?

Firstly, why bother to read a book a week?

The ‘why’ is actually very important, and it’s about more than ‘books’.

It’s better to think of this as ‘reclaiming your mind’ in an age when media (especially social media) attempts to control what we read (and hence what we think).

Numerous studies have shown that too much tech is ruining lives and, at the same time, that reading is one of the best indicators of career success.

But if you are a thinking man or woman, most media today is far from intellectual.

The Discovery Channel (supposedly based on science) broadcast the infamous Megalodon documentary. Not to be outdone on sensationalism, National Geographic (until recently owned by News Corp) aired a documentary called Cocaine Hippos (based on Pablo Escobar’s hippos in Colombia).

The point here is – even documentaries are sensationalist.

Even the so-called scientific ones – and an hour peppered with advertisements leaves no time for detailed analysis anyway. So, the overall idea is to see this not just as reading a book, but as changing our thinking.

Some overall objectives:

a)       Read the whole book but aim for comprehension

b)      Read things that you do not read normally

c)       Combine the reading of both larger and smaller books


Here are the strategies

1)      Try to read unabridged books, but use SparkNotes or similar. I would never have been able to read the unabridged Moby Dick without SparkNotes.

This strategy gives you the big picture and also helps you read complex material with comprehension – for example, because you already understand the main themes.

2)      How to read a book which cannot be finished in a week? The answer is – don’t attempt to finish a book in a week. Many books take more than one week. So, at any point, you are reading more than one book – but you are always finishing a book in a week

3)      Speed reading does not work for me – although my reading speed increased dramatically over the year. I am sure speed reading works for some; it did not for me.

4)      When reading literature – use comics, especially Classics Illustrated, to start off with, before you get to the original book. I did say that my strategies are unorthodox! This strategy worked very well for me. I love comics, and reading Classics Illustrated allowed me to read many books which I would not have attempted otherwise.

5)      Read for completion and balance with comprehension. This works quite well because you can skip the filler material in a book (especially if you combine with the other strategies)

6)      Read Mortimer Adler’s How to Read a Book – first published in 1940 and still relevant today.

7)      I can read business and technical books fast, but many other categories not so. For example, Lee Kuan Yew: The Man and His Ideas – although not very large – took a lot longer because the content was excellent.

8)      Read Neil Postman’s Amusing Ourselves to Death.

It’s an iconic book – from the age of television – on why you need to rethink your media strategy. See this review of the book and many others online.

9)      Read a range of different books across cultures – this is an excellent list – The most iconic books set in 150 countries

10)      Read in chunks of 30 pages a day. That means you can finish a reasonably sized book (7 × 30 = 210 pages) in a week.

11)      Collect a set of book lists – here are a few: the Bill Gates book list, the Tim Ferriss book list and the James Clear book list.

12)      Try to read bigger books, for the reasons I mentioned above. Until the end of the year, I have two specific books I am reading: the autobiography of Marshal Zhukov (see this review of Marshal of Victory: The Autobiography of General Georgy Zhukov) and Walter Isaacson’s biography of Leonardo da Vinci.



Today, I feel it’s not so fashionable to be thoughtful. In many areas, like politics, we see intelligent, thoughtful leaders being penalized. Yet, that’s exactly why this kind of thinking is called for. I see some notable exceptions – this thoughtful young girl is very inspiring: How To Read More || How I Read 102 Books in a Year!

Both Buffett and Gates are avid readers (see The simple truth behind reading 200 books a year and How Bill Gates reads a book). I read 48 books in 2017, and in 2018 this continues. So, overall, the experiment was a success, and I hope you can learn from it.


AI labs – a club for #AI research and a chance to gain hands-on experience with AI


We have been working on this idea over the summer and have now launched the next stage of the AI labs in London

Here are some more details.

Think of AI labs – as a club for AI research

AI labs addresses three problems

a)  Today, most people and companies are not working on Deep Learning and AI – even if they are working on Machine Learning (rare in itself). The lab offers you the opportunity to build AI applications and deploy them in open source – thus demonstrating your expertise

b)  We will work on specific Robotics / AI deep learning technologies often to implement research papers. This is another big gap in the industry. It’s hard to take a paper and implement it on your own (especially if it’s not really a part of your day job)

c)  We will also work with AI models in the Cloud and Automated Machine Learning – both of which I believe will significantly broaden the market for AI


How we work
a) We work in sprints where you implement code and publish it under the Apache v2 license. Thus, you get to demonstrate your technical expertise

b) We typically implement papers around GANs, Reinforcement Learning etc

The overall idea is to gain real experience and demonstrate it in rapid iterations using Agile/Sprint

Steps for the sprints are
a)      We present a challenge/ problem (typically code based)
b)      You choose which sprints you join
c)      The group solves it collaboratively within the sprint timeframe
d)      At the end of the sprint time we provide the solution
e)      The Group then lists the best contributions i.e. who made the best efforts
f)       And if it’s a good effort, we publish it on an external GitHub repository, listing the contributors

This helps you to create a real GitHub portfolio and also to work with others in collaboration.

We start off with three problems (sprints) initially.

You can join more than one sprint.
The three initial sprints are likely to be

a)      Reinforcement learning
b)      NLP (Robotics)
c)      A research paper

Problems are set typically by the tech architect team which includes

Ajit Jaokar

Dr Daria Shamrai

Dr Saed Hussain

Cheuk Ting Ho

Dr Amita Kapoor


Sprints are managed by

Rama Govardhana and Dr Ahmad Abd Rabuh


Examples of technologies / papers
a)  Paper – Trajectory Optimization using Reinforcement Learning for Map
b)  The TRFL library open sourced by Deepmind
c)  Unity 3d and Deepmind 
d)  In future, AWS and Azure AI and some form of Automatic ML


Why Robotics/ Devices?
a) It’s easier to learn AI in a physical context, and that knowledge is transferable – e.g. you can apply time series techniques to fintech
b) AI and IoT will be huge but are still a niche (hence a gap)
c) It reflects my personal network including existing collaborations like Dobot and my teaching


We are also pleased to collaborate with

Barend Botha (Signacore)

Devrim Sonmez

Dobot – for Robotics

eOffice (Pier Paolo and Oscar Chu Ortega).

Many thanks to Barend Botha for the Logo.

If you want to join the Lab and are interested to know more – please connect with me on Ajit Jaokar Linkedin (with a message in the connection request re Lab).  

We are still piloting this initiative so we are keeping the group small so we can learn.

We start small and grow. I am a big believer in AI and invest a lot into it. So, this is a long-term vision.

Coding for #AI and #machinelearning – online course/workshop


We are exploring a new way of learning how to code for AI and Machine Learning by applying the ideas of deliberate practice.

Starting Oct 2018 – the workshop has limited places

Please contact [email protected] if you want to sign up for our online workshop (300 USD)


Deliberate practice is a technique which probably originated in the former Soviet Union to train world-class athletes.

Deliberate practice is also used in learning complex skills like playing the violin – which requires mastering many small steps and then putting the steps together into a complex whole (like a violin concert).

Deliberate practice always follows the same pattern: break the overall process down into parts, identify your weaknesses, test new strategies for each section, and then integrate your learning into the overall process.

We apply deliberate practice to learning coding for AI.

To try this, we work in small sections of code which you should master bit by bit

We divide the code into four sections
1) Pandas, NumPy and Matplotlib

2) Data manipulation and feature engineering

3) Machine Learning models and validation

4) Deep learning

Each of these topics is divided into detail (subtopics) as below

(note: the code itself is taken from the public domain and from existing books under the Apache v2 license)

Pandas, NumPy and Matplotlib
· initializing NumPy array
· Creating NumPy array
· NumPy datatypes
· Field access
· Basic slicing
· Advanced indexing
· Array math: Sum function, Transpose function
· Broadcasting
· Creating a pandas series
· Creating a pandas dataframe
· Reading / writing data from csv, text, Excel
· Basic statistics on dataframe
· Creating covariance on dataframe
· Creating correlation matrix on dataframe
· Concat or append operation, Merge, join dataframes
· Grouping operation, Pivot tables on Dataframe
· Plots, Bar charts, Pie charts etc for DataFrames
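As a taster of this first section, here is a minimal sketch (toy values only, not workshop code) covering a few of the listed subtopics: creating a NumPy array, broadcasting, basic slicing, and a correlation matrix on a pandas dataframe.

```python
import numpy as np
import pandas as pd

# Creating a NumPy array with an explicit datatype
a = np.array([[1, 2, 3], [4, 5, 6]], dtype=np.float64)

# Broadcasting: the 1-D row is "stretched" across both rows of a
row = np.array([10, 20, 30])
b = a + row  # shape (2, 3)

# Basic slicing: first row, last two columns
first_row_tail = a[0, 1:]

# Creating a pandas dataframe and a correlation matrix on it
df = pd.DataFrame({"x": [1, 2, 3, 4], "y": [2, 4, 6, 8]})
corr = df.corr()  # x and y are perfectly correlated here
```

Each of these one-liners corresponds to a topic above; the workshop drills them individually before combining them.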

Data manipulation and feature engineering
· Converting categorical variable to numerical
· Normalization and scaling
· Univariate analysis
· Pandas dataframe visualization
· Multivariate analysis
· Correlation matrix
· Pair plot
· Scatter plots
· Find outliers
· Load data
· Normalize data
· Split data into train and test
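A minimal sketch of a few of these steps (the column names and values are hypothetical): converting a categorical variable to numerical via one-hot encoding, min-max normalization, and a simple positional train/test split.

```python
import pandas as pd

df = pd.DataFrame({
    "colour": ["red", "blue", "red", "green"],  # categorical feature
    "size": [10.0, 20.0, 30.0, 40.0],           # numerical feature
})

# Converting a categorical variable to numerical (one-hot encoding)
encoded = pd.get_dummies(df, columns=["colour"])

# Min-max normalization of the numerical column to [0, 1]
s = df["size"]
df["size_scaled"] = (s - s.min()) / (s.max() - s.min())

# A simple train/test split: first 75% train, the rest test
split = int(len(df) * 0.75)
train, test = df.iloc[:split], df.iloc[split:]
```

In practice a shuffled split is usually preferred; the positional split here just keeps the sketch short.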

Machine Learning models and validation
· Linear regression
· Linear regression model accuracy matrices
· Polynomial regression
· Regularization
· Nonlinear regression
· Logistic regression
· Confusion matrix
· Area Under the Curve
· Under-fitting, right-fitting, and over-fitting
· Logistic regression model training and evaluation
· Generalized Linear Model
· Decision tree model
· Support vector machine (SVM) model
· Plotting SVM decision boundaries
· k Nearest Neighbors model
· k-means clustering
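As a minimal sketch of the first two items (linear regression and a model accuracy metric), here fitted with NumPy least squares rather than scikit-learn, on noise-free toy data:

```python
import numpy as np

# Toy data following y = 3x + 1 exactly
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 3 * x + 1

# Ordinary least squares fit of a degree-1 polynomial (a line)
slope, intercept = np.polyfit(x, y, 1)

# Model accuracy metric: R^2, the coefficient of determination
pred = slope * x + intercept
ss_res = np.sum((y - pred) ** 2)           # residual sum of squares
ss_tot = np.sum((y - y.mean()) ** 2)       # total sum of squares
r2 = 1 - ss_res / ss_tot                   # 1.0 means a perfect fit
```

With real, noisy data R² drops below 1, which is where the under-fitting/over-fitting discussion above comes in.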

Deep Learning
· sklearn perceptron code
· loading MNIST data for training MLP classifier
· sklearn MLP classifier
· Bernoulli RBM with classifier
· grid search with RBM + logistic regression
· Keras MLP
· Compile model
· Train model and evaluate
· dimension reduction using autoencoder
· de-noising using autoencoder
· Keras LSTM
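The section itself uses sklearn and Keras; as a library-free illustration of what an MLP's dense layers compute, here is a single forward pass sketched in NumPy with hypothetical weights:

```python
import numpy as np

def relu(z):
    # The activation used between the two layers
    return np.maximum(0, z)

# One input sample with 2 features
x = np.array([[1.0, 2.0]])

# Hypothetical (hand-picked) weights and biases for a 2-layer MLP
W1 = np.array([[1.0, -1.0], [0.5, 0.5]]); b1 = np.array([0.0, 0.0])
W2 = np.array([[1.0], [1.0]]);            b2 = np.array([0.5])

# Forward pass: hidden = relu(x W1 + b1), out = hidden W2 + b2
hidden = relu(x @ W1 + b1)   # shape (1, 2)
out = hidden @ W2 + b2       # shape (1, 1)
```

A Keras `Dense` layer performs exactly this matrix multiply, bias add and activation; training then adjusts W1, W2, b1, b2 by backpropagation.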

Please contact [email protected] if you want to sign up for our online workshop (300 USD)

Image source: Vanessa Mae – violin prodigy/player

Implementing Enterprise AI – Online workshop

Implementing Enterprise AI – Online workshop


Ajit Jaokar and Cheuk Ting Ho


Early bird discounted rate $599 USD




Launched for the first time and jointly delivered by Ajit Jaokar and Cheuk Ting Ho, the Implementing Enterprise AI workshop is an online workshop targeting developers and strategists.

 The workshop enables you to develop a personal strategic case study for implementing Enterprise AI.

Some knowledge of Python is helpful, but it is not mandatory.

The workshop focuses on the professional deployment of AI in the Enterprise along with the underlying business case

The professional deployment of AI in Enterprises differs from the content of a typical training course. In larger organisations, the Data Science function typically spans three distinct roles: the Data Engineer, the Data Scientist and the DevOps Engineer. The Data Scientist is primarily responsible for developing the Machine Learning and Deep Learning algorithms. The Data Engineer and the DevOps Engineer work in conjunction with the Data Scientist to manage the product/service lifecycle.

The workshop is based on the following considerations

a)      Emphasis on the full AI pipeline

b)      Understanding the business case for Enterprise AI

c)      Understanding the practical implementation considerations for Enterprise AI

d)      Adopting a pragmatic approach to balance against the media hype

e)      Developing a case study as part of the workshop

f)       Adopting a Use Case approach



Module One: Concepts

In this module, we cover the following:

  • What is Enterprise AI
  • Why use AI in enterprises
  • Industry landscape of enterprise AI
  • Understanding different issues and strategies in Enterprise AI
  • Architectural design considerations for AI in enterprises
  • Understand what problems Enterprise AI solves
  • Understand the issues behind deploying Enterprise AI applications in scale for Enterprises
  • Study the strategies of large scale Enterprise AI vendors (Azure, Amazon and Google)
  • Use cases


Module Two: AI models used in the Enterprise

In this section, we study AI / Deep learning algorithms. These include:

  • Overview of Machine learning and Deep Learning
  • Deep-learning techniques. MLP, CNN, RNN, Autoencoders, Variational Autoencoders, GANs


Module Three: Enterprise AI Business case

Broadly, we can consider an AI process as one which improves with experience

The business case for Enterprise AI is a moving goalpost. It is driven by a number of considerations which we cover in this module


Module Four: Unique considerations for Enterprise AI

Here, we cover specific drivers for Enterprise AI

  • Explainable AI
  • Automatic Machine Learning
  • New regulatory structures relevant to AI, e.g. GDPR, payment regulation, etc.


Module Five: AI and the Cloud – what does it mean

In this module, we look at the issues of deploying AI models in the Cloud including

  • Access to large amounts of compute in the cloud
  • Pay-as-you-go – leverage GPU machines when you need them without upfront costs
  • Containerisation of models means the ability to train and package in the cloud but deploy anywhere
  • Model management for version control and fresh relevant models when needed
  • Inference and devices
  • Access to services in the streaming, big data and ML space that can help you build complex architectures without worrying about the underlying infrastructure


Module Six: Code walkthrough – Credit approval

Code walkthrough with an end to end example for Enterprise AI (Credit approval)


Module Seven: Code walkthrough – NLP

Code walkthrough with an end to end example for Enterprise AI (Natural Language Processing)


Module Eight: AI – Spark and Streaming

An understanding of the streaming architecture, including Kafka, Spark (PySpark) and MongoDB, for large-scale deployment of AI in Enterprises


Module Nine: End to End Pipeline for AI with CICD

In this module, we explore the real-world, large-scale deployment of AI, including Continuous Integration and Continuous Delivery (CI/CD). DevOps creates a culture of increased collaboration, ownership and accountability, enabling teams to create large-scale applications. CI/CD can be seen as an evolution of Waterfall and Agile methodologies.


Module Ten: Case study exercise review

Review of class exercise and case study (online and in group)


Questions and contact details

  • The workshop is delivered through pre-recorded videos which you can view at any time
  • Code is used to demonstrate the application as a walkthrough. This is not a coding/hands-on workshop, and the code is for illustrative purposes (and not supported in the workshop). You will have access to the code under the Apache v2 license on an as-is basis.
  • Exercise Case study – developed in the group based on the material. You can choose the theme for the case study related to Enterprise AI covering the material in the course. Note that the case study does not cover the code.
  • Duration – you have up to three months to complete the case study exercise in the group. Modules will be posted once a week.
  • The course is industry led. It is not affiliated with an academic institution
  • The course includes a certificate of completion
  • Please contact ajit.jaokar at for signup and any other questions





Based in London, Ajit’s work spans research, entrepreneurship and academia relating to Artificial Intelligence (AI) and the Internet of Things (IoT). Ajit works as a Data Scientist (Bioinformatics and IoT domains). He is the course director at Oxford University for “Data Science for Internet of Things”. Besides Oxford University, Ajit has also conducted AI courses at LSE and UPM, and has been part of the Harvard Kennedy Future Society research on AI.

Cheuk Ting Ho is a Data Scientist in Machine Learning and Deep Learning. She contributes regularly to the Data Science community as a public speaker, by encouraging women in technology, and by actively contributing to Python open source projects.



Learn AI and Data Science rapidly based only on high school math


What if you could learn AI and Data Science based on knowledge you already know?

You have an opportunity to accelerate your learning of AI in a unique way through this limited, early bird offer

Here is a simple observation:

The mathematical foundations of Data Science rest on four elements i.e. Linear Algebra, Probability Theory, Multivariate Calculus, and Optimization theory.

 Most of these are taught (at least partially) in high schools. 

In this program, we use these maths foundations (which you learnt in high school) to teach the foundations of Data Science and Artificial Intelligence.

The program is interactive and personalized for a duration of three months. It includes a certificate of completion. Coding examples are in Python (Pandas and NumPy)

The program delivery is by video. The program also includes a copy of a pdf book.

Starting on July 1, 2018. Limited places. Price 300 USD Available for a limited time only

Contact [email protected]

Created by Ajit Jaokar: Based in London, Ajit Jaokar works in Data Science and Artificial Intelligence. He teaches at Oxford University (Data Science for Internet of Things).

Image source: Shutterstock

AI labs – Learning unsupervised learning through Robotics

We are launching an AI lab. The goal is to learn unsupervised learning through Robotics (Cobots)

Long seen as a poor cousin to supervised learning, unsupervised learning techniques – with variational autoencoders, reinforcement learning and generative adversarial networks – have moved beyond the limitations of autoencoders.

From Oct 2018 to March 2019, we are running a pilot. The lab will be focused on open source and possibly run as a future B corp (social enterprise).

The lab will be launched as an educational venture. It provides a way to work with projects and to experiment with innovation.

Members build and demo to the group, i.e. build a repository / GitHub portfolio.



Running from October to March

Membership only

Limited places - London only initially


Technical goals

To explore AI through Cobots – collaborative robots (specifically the Dobot) – using unsupervised learning, specifically variational autoencoders, reinforcement learning and generative adversarial networks.

All code in open source (Apache v2 license)

Focused on AI models for robotics (NLP, Reinforcement learning, CNN, Computer Vision etc) and also ROS (Robot operating system). Exploring complex problems like training of intelligent agents through simulation engines



A few of the meet-up members have volunteered to be part of the Tech advisory. Many thanks to the group!

Tech leadership team comprises



Ajit Jaokar

Dan Howarth

Dr Saed Hussain

Cheuk Ting Ho

Andy Bovey

Dr Ahmad Abd Rabuh

Barend  Botha(as signacore)


Outside the UK

Dr Amita Kapoor  (New Delhi)

Dr Daria Shamrai (St Petersburg – Russia)


We are also pleased to collaborate for the pilot with eOffice – also our office in London (Pier Paolo Mucelli, Oscar Chu Ortega).


Project leaders to be announced soon



Small fee (400 GBP/6 months)




Evenings eOffice

Weekends TBC


Under 18s?

In near future

Will be considered at no cost (but accompanied)



Python + TensorFlow using Colab



Berlin and Milan


Comments welcome. Please email me if you want to join this initiative info at newsletter

AI / Deep Learning applications course – with hands-on experience



New workshop in London / remote

Link - Enterprise AI workshop – Sep 2018 – in London or remote



AI / Deep Learning applications course/ mentoring program – hands-on experience with limited spaces

I am pleased to announce a new course on AI Applications

The course combines elements of teaching, coaching and community. For this reason, the batch sizes are small and selective. I will be working with a small, selective group of people to actively transition their careers to AI through education and my network, towards specific outcomes/goals. This course involves hands-on experience (if you are interested in this).

Early bird discounts now for a limited time ..

In a nutshell

  • The course (spanning three months) covers concepts, theory and coding for AI (in Python, i.e. TensorFlow and Keras) and also deployment.
  • Career guidance and mentoring
  • As a part of the mentoring process, you outline your goals and we work toward them. The mentoring may extend beyond the course.
  • You thus get access to a community and lifetime access to the content.
  • Starting June 1 2018
  • Audience: For professionals who want to learn AI concepts, applications and coding for AI (Deep Learning) from an application perspective. The course targets developers and Architects who want to transition their career to Enterprise AI. The course correlates the new AI ideas with familiar concepts like ERP, Data warehousing etc and helps to make the transition easier.
  • Pre-requisites Some basic knowledge of Python needed (or any programming language)
  • Coding using Tensorflow and Keras
  • Goals: To rapidly transition your career to AI in a supporting/mentoring/self-paced community
  • Delivery format is via video and online sessions (once every two weeks)
  • You can choose a strategic (non-coding) option also
  • Please contact [email protected] for more details and Pricing. See testimonials below
  • Covers AI for IoT and the Enterprise
  • The course is personalized, and quizzes are by milestone
  • The course involves hands-on experience. This means you participate in an existing project in sprints. Note that this is different from the project you will do as part of the course. The hands-on option involves an open source project where you will work in sprints. It is optional and helps you gain real-life experience.

The course is unique in 3 ways

a)  Personalized and small numbers (I actually talk to everyone!). There is no cost for personalization
b)  Projects are by interest and vertical domain. We use a common codebase and then create small groups by interest (e.g. insurance, bioinformatics, etc.)
c)  you have lifetime access to the content

Course outline

Through the course, every participant should be able to develop a reusable codebase / library for solving problems using Tensorflow and Keras. This library then enables you to reuse the code in your applications.
The course comprises three milestones

Concepts milestone

In the Concepts milestone, we take a use case approach. We cover AI, IoT, Machine Learning, Platforms and Applications. We also discuss an overall methodology for applying AI techniques to Enterprise and IoT problems.

Development milestone
In the development phase, we first cover the basics of development in Python (Tensorflow and Keras) for machine learning applications
We then cover three models (MLP, CNN and LSTM)
We also cover the theory of MLP, CNN and LSTM
Finally, we cover Python in more detail through development of a set of techniques for Deep learning applications
Deployment milestone
In this section, we discuss the deployment of Deep learning applications through Flask, Docker, Kubernetes and other real-world techniques

Testimonials for our courses

Jean Jacques Bernand – Paris – France

“Great course with many interactions, either group or one to one that helps in the learning. In addition, tailored curriculum to the need of each student and interaction with companies involved in this field makes it even more impactful. As for myself, it allowed me to go into topics of interests that help me in reshaping my career.”

Johnny Johnson, AT&T – USA

“This DSIOT course is a great way to get up-to-speed.  The tools and methodologies for managing devices, wrangling and fusing data, and being able to explain it are taking form fast; Ajit Jaokar is a good fit.  For me, his patience and vision keep this busy corporate family man coming back.”

Yongkang Gao, General Electric, UK.

“I especially thank Ajit for his help on my personal project of the course — recommending proper tools and introducing mentors to me, which significantly reduced my pain in the beginning stage.”

Karthik Padmanabhan, Manager – Global Data Insight and Analytics (GDIA) – Ford Motor Pvt Ltd.

“I am delighted to provide this testimonial to Ajit Jaokar who has extended outstanding support and guidance as my mentor during the entire program on Data science for IoT. Ajit is a world renowned professional in the niche area of applying the Data science principles in creating IoT apps. Talking about the program, it has a lot of breadth and depth covering some of the cutting edge topics in the industry such as Sensor Fusion, Deep Learning oriented towards the Internet of things domain. The topics such as Statistics, Machine Learning, IoT Platforms, Big Data and more speak about the complexity of the program. This is the first of its kind program in the world to provide Data Science training especially on the IoT domain and I feel fortunate to be part of the batch comprising of participants from different countries and skill sets. Overall this journey has transformed me into a mature and confident professional in this new space and I am grateful to Ajit and his team. My wish is to see this program accepted as a gold standard in the industry in the coming years”.


Peter Marriott – UK –

Attending the Data Science for IoT course has really helped me in demystifying the tools and practices behind machine learning and has allowed me to move from an awareness of machine learning to practical application.


Yair Meidan Israel –

“As a PhD student with an academic and practical experience in analytics, the DSIOT course is the perfect means by which I extend my expertise to the domain of IoT. It gradually elaborates on IoT concepts in general, and IoT analytics in particular. I recommend it to any person interested in entering that field. Thanks Ajit!”


Parinya Hiranpanthaporn, Data Architect and Advanced Analytics professional Bangkok

“Good content, Good instructor and Good networking. This course totally answers what I should know about Data Science for Internet of Things.”


Sibanjan Das – Bangalore

Ajit helped me to focus and set goals for my career that is extremely valuable. He stands by my side for every initiative I take and helps me to navigate me through every difficult situation I face. A true leader, a technology specialist, good friend and a great mentor. Cheers!!!


Manuel Betancurt – Mobile developer / Electronic Engineer. – Australia

I have had the opportunity to partake in the Data Science for the IoT course taught by Ajit Jaokar. He has crafted a collection of instructional videos, code samples, projects and social interaction with him and other students of this deep knowledge.

Ajit gives an awesome introduction and description of all the tools of the trade for a data scientist getting into the IoT. Even when I really come from a software engineering background, I have found the course totally accessible and useful. The support given by Ajit to make my IoT product a data science driven reality has been invaluable. Providing direction on how to achieve my data analysis goals and even helping me to publish the results of my investigation.

The knowledge demonstrated on this course in a mathematical and computer science level has been truly exciting and encouraging. This course was the key for me to connect the little data to the big data.


Barend Botha – London and South Africa –

This is a great course for anyone wanting to move from a development background into Data Science with specific focus on IoT. The course is unique in that it allows you to learn the theory, skills and technologies required while working on solving a specific problem of your choice, one that plays to your past strengths and interests. From my experience care is taken to give participants one to one guidance in their projects, and there is also within the course the opportunity to network and share interesting content and ideas in this growing field. Highly recommended!

- Barend Botha


Jamie Weisbrod – San Diego –

Currently there is a plethora of online courses and degrees available in data science/big data. What attracted me to joining the futuretext class “Data Science for ioT” is Ajit Jaokar. My main concern in choosing a course was how to leverage skills that I already possessed as a computer engineer. Ajit took the time to discuss how I could personalize the course for my interests.

I am currently in the midst of the basic coursework but already I have been able to network with students all over the world who are working on interesting projects. Ajit inspires a lot of people at all ages as he is also teaching young people Data science using space exploration.


 Robert Westwood – UK – Catalyst computing

“Ajit brings to the course years of experience in the industry and a great breadth of knowledge of the companies, people and research in the Data Science/IoT arena.”

Companies / Participants who have been part of our courses

  • Technology: GE, HPE, Oracle, TCS, Wipro, HCL, HPE, Dell, Honeywell
  • Banking and Fintech : Goldman Sachs, ABN Amro, Nordea, Santander, BNP Paribas
  • Telecoms : Nokia, AT&T, Ericsson
  • Consulting : McKinsey, PA consulting
  • Automotive : Ford, Daimler, Jaguar
  • Retail : Coca Cola, Target
  • Airlines and Aircrafts : Boeing, Airbus

(Note : Above list includes participants from companies and also companies who have sponsored their personnel)

Participant Countries

We are pleased to have participants from all over the world – leading to a vibrant and diverse learning ecosystem. A majority of our participants are from the UK, USA and India. But we also have participants from the following:

  • North America: USA, Canada
  • Europe: UK, Germany, France, Belgium, Poland, Russia, Norway, Italy, Finland, Ukraine, Austria, Ireland, Spain, Estonia, Sweden, Switzerland, Holland
  • Asia:  India, Japan, Thailand, Vietnam, Singapore
  • Middle East: UAE, Egypt, Iran
  • Latin America: Mexico, Brazil, Colombia, Nicaragua
  • Africa: South Africa, Zimbabwe
  • Australia and NZ: Australia


Please contact [email protected] for more details


IOTA – The potential to drive Data Science for IoT

I have a close circle of clued-in, tech-savvy friends whose views I take seriously. For the last few weeks, one of these friends has been sending me emails extolling the merits of something called IOTA, which bills itself as the next-generation blockchain. At first, I thought of IOTA as yet another cryptocurrency. A whole flock of people are rebranding themselves as Bitcoin/Blockchain/ICO experts and spamming me! So I was initially sceptical of anything claiming to be the ‘next-generation blockchain’. But some more investigation over the holiday season convinced me that IOTA could be a game changer. In this post, I explore the significance of IOTA and its implications for IoT analytics.

I explore such concepts in my course Implementing Enterprise AI using TensorFlow and Keras.

I also plan to explore IOTA in my course Data Science for Internet of Things at Oxford University.


What is (and what is not) IOTA

Before we proceed: this discussion is not about Bitcoin, and I am not an expert on Bitcoin. For a discussion of Bitcoin, see What is Bitcoin and why it matters from MIT Technology Review. IOTA is a cryptocurrency, but I am not an expert on cryptocurrencies either (e.g. the factors driving the price of the currency). I am more interested in the problem IOTA solves and its disruptive potential, especially for IoT. To understand that potential, we first need to understand what IOTA is (and what it is not).

Like blockchain, IOTA is a distributed ledger technology, but it aims to go beyond blockchain. Bitcoin uses blockchain technology and a distributed ledger system to conduct transactions. IOTA, however, does not use a blockchain. Instead, it uses the Tangle, a distributed ledger based on a directed acyclic graph (DAG). The activity of the system’s users propagates the Tangle ledger; this requires no fees, no miners, and no creation of new tokens. In contrast, to propagate the Bitcoin ledger, miners must perform computational work (or pay for previously mined Bitcoins).

Bitcoin also has a scalability problem: as more people use the system, it gets slower and transactions become more expensive to process. In contrast, the cost of using the IOTA ledger (the Tangle) is the user’s computational effort to verify two randomly selected prior sites (transactions) on the Tangle.

Source: the IOTA whitepaper. The main point is: “Every new transaction must approve two other transactions.” In this sense, IOTA is an attempt to create a superior cryptocurrency platform by overcoming the limitations of blockchain.
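That approval rule can be sketched in a few lines of Python. This is a toy illustration of the idea (a DAG where each new transaction approves two randomly chosen prior ones), not the actual IOTA protocol; all class and method names here are hypothetical.

```python
import random

# Toy sketch of a Tangle-style DAG ledger (illustrative only, NOT the IOTA
# protocol): each new transaction approves two randomly selected prior
# transactions, so the ledger grows without miners or fees.

class TangleSketch:
    def __init__(self):
        # Start from a single genesis transaction with no approvals.
        self.approvals = {0: []}   # tx id -> ids of the prior txs it approves
        self.next_id = 1

    def add_transaction(self):
        # Select two existing transactions to approve
        # (fewer than two exist only right after genesis).
        existing = list(self.approvals)
        approved = random.sample(existing, k=min(2, len(existing)))
        tx_id = self.next_id
        self.approvals[tx_id] = approved
        self.next_id += 1
        return tx_id

    def tips(self):
        # "Tips" are transactions not yet approved by any other transaction.
        seen = {a for aps in self.approvals.values() for a in aps}
        return [tx for tx in self.approvals if tx not in seen]

tangle = TangleSketch()
for _ in range(10):
    tangle.add_transaction()
print(len(tangle.approvals), "transactions; tips:", tangle.tips())
```

Because every transaction only ever approves transactions that already exist, the structure is acyclic by construction, which is what makes it a DAG rather than a chain.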

What problem does IOTA solve and why does that matter

If all IOTA created was a ‘better blockchain’, it would be interesting but not disruptive. What makes IOTA potentially disruptive is IoT. For example, an IoT sensor in a car could retrieve data from the factory automatically; IoT devices could connect and transact with each other in peer-to-peer mode. Potentially, IOTA could help 50 billion devices to connect. IOTA is gaining traction (despite some recent hiccups). The IOTA Foundation announced that Robert Bosch Venture Capital GmbH (RBVC), the corporate venture capital company of the Bosch Group, has purchased a significant number of IOTA tokens, and that Dr. Hongquan Jiang, a partner at RBVC, will join the IOTA Foundation’s advisory board. The core feature of IOTA is the ability for devices to transfer data through the Tangle. With recent extensions to the core, IOTA can even operate in ‘one to many’ mode, i.e. broadcast messages to devices.

What is the implication of IOTA for Data Science for IoT

The ability to manage and share data securely has profound implications for next-generation IoT applications such as self-driving cars and drones. Such devices would need to collaborate with their peers. A leasing model for devices could arise instead of an ownership model, and that leasing/collaboration model could also extend to the data arising from IoT devices. Furthermore, interactions between devices could happen autonomously. IOTA could thus become the backbone for IoT applications.

If “Data is the next Oil” makes you cringe .. this is for you

The term ‘Data is the next Oil’ often makes me cringe .. because it’s your data and their Oil! But for a long time, there was not much you could do about it. At least for sensor data, IOTA offers a potentially disruptive way out while still helping to foster an ecosystem. If you want to work with me on ideas such as this, I explore these concepts in my course Implementing Enterprise AI using TensorFlow and Keras.

Comment and Disclosure

a)  This post is narrowly confined to the potential of IOTA for IoT and Data Science for IoT

b)  I do not claim any expertise or knowledge of IOTA as a cryptocurrency

c)  The cryptographic security discussions regarding IOTA are also out of scope (and I am not an expert on this)

d)  I do not hold any IOTA currency at the time of writing

agilePHM – a new open source product for rapid prototyping of PHM analytics


We are launching a new product called agilePHM

In Industrial IoT, I have been working with PHM (Prognostics and Health Management) for a while; it is a well-known discipline.

Prognostics and Health Management (PHM) is an engineering discipline focused on predicting the time at which a system or a component will no longer perform its intended function. This lack of performance is most often a failure beyond which the system can no longer be used to meet desired performance. The predicted time then becomes the remaining useful life (RUL), which is an important concept in decision making for contingency mitigation. Prognostics predicts the future performance of a component by assessing the extent of deviation or degradation of a system from its expected normal operating conditions. The science of prognostics is based on the analysis of failure modes, detection of early signs of wear and aging, and fault conditions. An effective prognostics solution is implemented when there is sound knowledge of the failure mechanisms that are likely to cause the degradations leading to eventual failures in the system. (source: Wikipedia)
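The RUL concept can be illustrated with a minimal sketch: fit a trend to a degrading health indicator and extrapolate to the point where it crosses a failure threshold. The data and the threshold below are made up for illustration; real PHM analytics use far richer degradation and failure-mode models.

```python
import numpy as np

# Minimal RUL sketch (illustrative, with made-up data): fit a linear trend
# to a degrading health indicator and extrapolate to a hypothetical
# failure threshold.

def estimate_rul(times, health, failure_threshold):
    """Return estimated remaining useful life from the last observation."""
    slope, intercept = np.polyfit(times, health, deg=1)
    if slope >= 0:
        return float("inf")  # no degradation trend detected
    # Time at which the fitted trend crosses the failure threshold.
    t_fail = (failure_threshold - intercept) / slope
    return max(t_fail - times[-1], 0.0)

# Hypothetical sensor readings: health degrades from 1.0 towards
# a failure threshold of 0.2.
times = np.arange(0, 50, 5, dtype=float)
health = 1.0 - 0.01 * times   # real data would be noisy
rul = estimate_rul(times, health, failure_threshold=0.2)
print(f"Estimated RUL: {rul:.1f} time units")
```

Even this toy version shows the two moving parts of a prognostics pipeline: a degradation model (here, a straight line) and a failure criterion (here, a fixed threshold).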


PHM applies to a range of domains such as defence, shipping, and industrial applications.

We are developing a product for rapid prototyping of the analytics component of PHM

agilePHM is designed for rapid prototyping of PHM applications (implemented on standard hardware)

The need for the product

The idea of agilePHM arose due to a few observations

a)  There is a need for rapid prototyping of new ideas (from an analytics standpoint) as an exclusive function: Industrial IoT is a new and evolving space. Ideas from different domains cross-pollinate, and there is a need to quickly test out concepts (either products or processes)

b)  Data science skills shortage: data science skills are expensive and often focused on industries like banking (in contrast to Industrial IoT). So, think of agilePHM as a ‘Data Scientist in a box’ for the Industrial IoT space

c)  Larger products have a much heavier footprint: our customer is someone who wants to rapidly prototype a model (without knowing the algorithms in detail). Larger products have a much heavier footprint; many feel like installing ERP in the old days! They treat rapid prototyping as a small component rather than as their exclusive emphasis

d)  Flexibility: the approach complements existing approaches such as physics-based modelling

e)  Why open source: our main strength lies in IoT analytics (Ajit Jaokar teaches a course on Data Science for Internet of Things at the University of Oxford). However, the problem we address is complex because there are many processes (and many machines!) to abstract algorithmically. That calls for some form of open source.


agilePHM has three components

1) Digital Twin

2) Rapid prototyping

3) Workflow – Process engineering

agilePHM will have the following deployment models:

  • On premise with support
  • Open source
  • Kaggle-like contest community engagement

It also allows students on our course to gain real-life, practical experience


a)  If you are a company interested in working with us, please email me at ajit.jaokar at

b)  If you are interested in gaining real experience in AI, you can work on the product with companies as part of our course. Please contact info at to know more