Workshop: A hands-on Introduction to Deep Learning using Keras and TensorFlow

On Saturday, July 22, in London we have a very interesting workshop: A hands-on Introduction to Deep Learning using Keras and TensorFlow, by Ajit Jaokar and others.

Both Keras and TensorFlow are hot technologies, and they are a great way to get started with Deep Learning.

What you will learn: First, you will learn the theory and principles of Deep Learning (enough to understand Deep Learning code).
Then we will look at a tutorial in Keras and TensorFlow, i.e. a hands-on implementation of a basic neural network.
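To give a flavour of the hands-on part, here is a minimal sketch of the kind of basic network the tutorial builds. It is illustrative only, not the exact workshop notebook, and assumes Keras with the TensorFlow backend is installed:

```python
# A minimal sketch of a basic neural network in Keras (TensorFlow backend).
# Illustrative only - not the exact workshop code.
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# Toy dataset: learn the XOR function
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 0])

model = Sequential()
model.add(Dense(8, activation='relu', input_dim=2))  # hidden layer
model.add(Dense(1, activation='sigmoid'))            # output layer

model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
model.fit(X, y, epochs=500, verbose=0)

print(model.predict(X))  # predictions approach 0, 1, 1, 0
```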

What you need to know: Python
What you need to bring: A laptop with Keras and TensorFlow already installed

Where? @fablab in Notting Hill

Barclays Eagle Lab
81 Palace Gardens Terrace
Notting Hill Gate
London
W8 4AT

Pricing

This is free for participants of the AI course (but not for the IoT course).

Otherwise, there is a small fee.

Please email me at ajit.jaokar at futuretext.com if you are interested in attending

AI for Fintech course – Early discounts and limited places

Introduction

 

AI for Fintech is a new course with limited places, focused on AI design (product, development and data) for the fintech industry.

We will first explain the end-to-end principles of AI and Deep Learning, and then describe specific applications and the implications of deploying them in the context of fintech.

The course will be conducted by Ajit Jaokar and Jakob Aungiers 

 

Outline

Foundations

  • Foundations of Enterprise AI
  • Understanding the application of AI for fintech
  • Introduction to TensorFlow and Keras
  • End to end implementation for an AI application

 

Designing an AI product

  • Basics of Designing an AI product
  • Understanding Deep Learning
  • Machine Learning algorithms in TensorFlow and Keras
  • Designing with Deep Learning algorithms:
     - Multilayer Perceptron
     - Autoencoders
     - Deep Convolutional Networks
     - Recurrent Neural Networks
     - Reinforcement learning
     - Natural language processing
     - Basics of Text Analytics

 

Deploying AI products for fintech

  • Methodology for Enterprise AI projects
  • Deploying Enterprise AI
  • Understanding the Enterprise AI layer
  • Acquiring Data and Training the Algorithm
  • Processing and hardware considerations
  • Business Models – High Performance Computing – Scaling an AI system
  • Costing an AI system
  • Creating a competitive advantage from AI
  • Specific considerations for fintech: e.g. the EU payment directive (PSD2), etc.

 

Course Logistics

The course targets designers or developers who work with fintech.

Strategic Option: you can choose the strategic option (no coding)

Developer Option: the full course, based on development in TensorFlow and Keras

Duration: starting July 31, 2017; approximately six months

Offered online with video-based content

Fees: please contact us

Contact: [email protected]

AI for Smart Cities Lab launch

I have written about the AI for Smart Cities lab before, and we are close to launching it.

Created by futuretext in collaboration with citysciences (UPM) and Nvidia, the AI for Smart Cities lab explores complex problems in the deployment of AI for Smart Cities.

We have been working with the Nvidia Jetson product, but are really looking forward to working with both Metropolis and Isaac for Smart Cities.

The lab will initially focus on projects and events in London (based out of fablab), Berlin and Madrid.

Most of the initial development on the Nvidia platform will use TensorFlow.

We will have an event in London in the week of June 12.

More details of collaborators / team coming soon

Any questions – please email me on ajit.jaokar at futuretext.com

Re-thinking Enterprise business processes using Augmented Intelligence

 

In the 1990s, there was a popular book called Re-engineering the Corporation. Looking back now, Re-engineering has certainly had mixed success – but it did have an impact over the last two decades. ERP deployments led by SAP and others were a direct result of the Business Process Re-engineering phenomenon.

So, now, with the rise of AI: Could we think of a new form of Re-engineering the Corporation – using Artificial Intelligence? The current group of Robotic process automation companies focus on the UI layer. We could extend this far deeper into the Enterprise. Leaving aside the discussion of  the impact of AI on jobs, this could lead to augmented intelligence at the process level for employees (and hence an opportunity for people to transition their careers in the age of AI).

Here are some initial thoughts. I am exploring these ideas in more detail. This work is also a part of an AI lab we are launching in London and Berlin in partnership with UPM and Nvidia both for Enterprises and Cities

Re-thinking Enterprise business processes using Augmented Intelligence

How would you rethink Enterprise business processes using Augmented Intelligence?

To put the basics into perspective: we consider a very ‘grassroots’ meaning of AI. AI is based on Deep Learning. Deep Learning involves automatic feature detection from the data. You could model a range of data types (or combinations thereof) using AI:

a) Images and sound – Convolutional Neural Networks

b) Transactional – e.g. loan approval

c) Sequences – including handwriting recognition via LSTMs and Recurrent Neural Networks

d) Text processing – e.g. natural language detection

e) Behaviour understanding – via Reinforcement Learning

To extend this idea to process engineering for Enterprises and Cities, we need to:

a) Understand existing business processes

b) Break the process down into its components

c) Model the process using data and algorithms (both Deep Learning and Machine Learning)

d) Improve the efficiency of the process by complementing human activity with AI (Augmented Intelligence)

But this is just the first step: you would also have to consider the wider impact of AI itself.

So, here is my list / ‘stack’:

  • New processes due to disruption at the industry level (e.g. Uber)
  • Change of behaviour due to new processes (e.g. employees collaborating with robots as peers)
  • Improvements in current Business Processes for Enterprises: Customer services, Supply chain, Finance, Human resources, Project management, Corporate reporting, Sales and Logistics, Management
  • The GPU-enabled enterprise, e.g. Nvidia Grid – but more broadly, GPUs will democratize delivery of modern apps, enable more efficient hybridization of workflows, and unify compute and graphics
  • The availability of bodies of labelled data
  • New forms of communication: text analytics, natural language processing, speech recognition, chatbots

I am exploring these ideas in more detail as part of my work on the Enterprise AI lab we are launching in London and Berlin in partnership with UPM and Nvidia, both for Enterprises and Cities. I welcome your comments at ajit.jaokar at futuretext.com or @ajitjaokar.

 

 

Young Data Scientist – more about forthcoming book/kickstarter

The role of the data scientist is one of the hottest jobs in the industry today. But how do we inspire the next generation of data scientists?

I have been working on the idea of the Young Data Scientist for a few years now, with various iterations and pivots.

It is now ready to launch in its next version as a book / Kickstarter.

The easiest way to inspire the next generation of data scientists is to go back to the basics, i.e. the maths, because maths is the universal language that underpins progress and innovation.

This also aligns closely with my day job and my teaching – Data Science for Internet of Things at Oxford University.

It also allows me to create the book / Kickstarter using personal insights from my years of teaching.

So, if you consider the maths foundations needed to learn Data Science, you could divide them into four key areas:

Linear Algebra

Probability Theory and Statistics

Multidimensional Calculus

Optimization

All of these are taught (in some shape or form) in high schools (ages 13 to 17).

So, the book aims to build upon these foundations for a high school audience to inspire them to take up Data Science.

The challenge here is to simplify and correlate with existing maths knowledge, considering the audience (13 to 17 year olds),

and most importantly to inspire!

It would also take them on a path to becoming Artificial Intelligence (AI) aware.

Young Data Scientist will be a book, a kickstarter, a community

It will have Open source foundations

Young Data Scientist community will also work with teachers

And finally, the Young Data Scientist community will draw upon interesting examples in Space exploration, Genomics, Ecology etc

Coding will be in Python (including numpy and, sometimes, TensorFlow).
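To give a flavour of how the four areas come together in code, here is a small, hypothetical numpy sketch (not an extract from the book): fitting a straight line to noisy data by gradient descent touches linear algebra (vectors), probability and statistics (noisy data), calculus (gradients) and optimization (iterative updates).

```python
# Minimal sketch: fit y = w*x + b to noisy data with gradient descent.
# Linear algebra (vectors), statistics (noise), calculus (gradients), optimization (updates).
import numpy as np

np.random.seed(0)
x = np.linspace(0, 1, 50)
y = 2.0 * x + 1.0 + 0.1 * np.random.randn(50)  # a noisy straight line

w, b, lr = 0.0, 0.0, 0.1
for _ in range(2000):
    y_pred = w * x + b
    error = y_pred - y
    # Gradients of the mean squared error with respect to w and b
    grad_w = 2 * np.mean(error * x)
    grad_b = 2 * np.mean(error)
    w -= lr * grad_w
    b -= lr * grad_b

print(w, b)  # should end up close to 2.0 and 1.0
```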

Please email me at ajit.jaokar at futuretext.com

Best of my recent articles on KDnuggets and Data Science Central

I have been regularly featured on both KDnuggets and Data Science Central.

Here is a list of my recent top articles.

I discuss these ideas in the Implementing Enterprise AI course.

KDnuggets


  • Continuous improvement for IoT through AI / Continuous learning
     - 25 Nov 2016

    In reality, especially for IoT, it is not the case that once an analytics model is built, it will give results with the same accuracy until the end of time. Data patterns change over time, which makes it important to learn from new data and improve/recalibrate the models to get correct results. The article below explains this phenomenon of continuous improvement in analytics for IoT.

  • Data Science for Internet of Things (IoT): Ten Differences From Traditional Data Science
     - 26 Sep 2016

    The connected devices (the Internet of Things) generate more than 2.5 quintillion bytes of data daily. All this data will significantly impact business processes, and Data Science for IoT will take an increasingly central role. Here we outline 10 main differences between Data Science for IoT and traditional Data Science.
  • The Evolution of IoT Edge Analytics: Strategies of Leading Players - 02 Sep 2016

    This article explores the significance and evolution of IoT edge analytics. Since the author believes that hardware capabilities will converge for large vendors, IoT analytics will be the key differentiator.

  • How to Become a (Type A) Data Scientist
     - 23 Aug 2016

    This post outlines the difference between a Type A and Type B data scientist, and prescribes a learning path on becoming a Type A.

Implementing Enterprise AI course – new batch – April 2017

 

The January batch of the Implementing Enterprise AI course is completely sold out!

I am pleased to announce a new cohort of the Implementing Enterprise AI course, starting April 24, 2017. We are accepting registrations now. As usual, numbers are limited and we have an early bird discount.

Implementing Enterprise AI is a unique and limited edition course that is focussed on AI Engineering / AI for the Enterprise.

Created in partnership with H2O.ai , the course uses Open Source technology to work with AI use cases. Successful participants will receive a certificate of completion and also validation of their project from H2O.ai.

 To sign up or learn more, email [email protected]

The course covers

  • Design of Enterprise AI
  • Technology foundations of Enterprise AI systems
  • Specific AI use cases
  • Development of AI services
  • Deployment and Business models

The course targets developers and architects who want to transition their careers to Enterprise AI. The course correlates new AI ideas with familiar concepts like ERP, Data Warehousing, etc., and helps make the transition easier.

The implementation / development for the course is done using the H2O APIs for R, Python & Spark. 
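To give a flavour of that style of implementation, here is a minimal sketch using the H2O Python API. This is an illustration, not official course material; the file name "churn.csv" and the "churned" target column are hypothetical placeholders for your own data.

```python
# Minimal sketch with the H2O Python API (assumes the 'h2o' package is installed).
# 'churn.csv' and the 'churned' column are hypothetical placeholders.
import h2o
from h2o.estimators import H2OGradientBoostingEstimator

h2o.init()  # start or connect to a local H2O cluster

frame = h2o.import_file("churn.csv")
frame["churned"] = frame["churned"].asfactor()  # treat the target as a categorical label
train, test = frame.split_frame(ratios=[0.8], seed=42)

predictors = [c for c in frame.columns if c != "churned"]
gbm = H2OGradientBoostingEstimator(ntrees=50, max_depth=5, seed=42)
gbm.train(x=predictors, y="churned", training_frame=train)

print(gbm.model_performance(test))  # held-out evaluation
```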

 

Duration:
Starting April 2017; approximately six months (three months for the content and up to three months for the project)
The course includes a certificate of completion and validation of the project from H2O.ai. (Projects will be created in teams.)

 Course Logistics:

Offered online and offline (London and Berlin)

When: April 2017
Duration: approximately six months (including project)
Fees: contact us

To sign up or learn more, email [email protected]

 Outline

April – May 2017

 

  • Understanding the Enterprise AI layer
  • Introduction to Machine Learning
  • Unsupervised Learning
  • Supervised Learning
  • Generalized Linear Modeling
  • Gradient Boosting Machine
  • Ensembles
  • Random Forest
  • Programming foundations (see notes below)

 

June

  • Introduction to Deep Learning
  • Multilayer Perceptron
  • Autoencoders
  • Deep Convolutional Networks
  • Recurrent Neural Networks
  • Reinforcement learning
  • Programming foundations (see notes below)

 

July 2017

 

  • Natural language processing
  • Basics of Text Analytics
  • POS Tagging
  • Sentiment Analysis
  • Text Classification
  • Intelligent bots
  • Programming foundations (see notes below)

 

Aug – Oct 2017 – Projects and deployment

 

  • Deploying Enterprise AI
  • Acquiring Data and Training the Algorithm
  • Processing and hardware considerations
  • Business Models – High Performance Computing – Scaling an AI system
  • Costing an AI system
  • Creating a competitive advantage from AI
  • Industry Barriers for AI

 

Implementation of Enterprise AI use cases (in groups)

 

  • Healthcare
  • Insurance
  • Adtech
  • Fraud detection
  • Anomaly detection
  • Churn, classification
  • Customer analytics
  • Natural Language Processing, Bots and Virtual Assistants

 

Notes

  • The course covers Design of Enterprise AI, Technology foundations of Enterprise AI systems, Specific AI use cases, Development of AI services, and Deployment and Business models
  • The implementation / development for the course is done using R, Python and Spark with the H2O APIs
  • For Deep Learning, we also work with GPUs, TensorFlow, MXNet and Caffe
  • We focus on large-scale problems
  • Notes on Programming foundations: we assume that you have significant programming knowledge. However, we do not assume that you are familiar with Python, R or Spark.
  • The course provides you with background in these languages over the first three months. You will then use this knowledge to work on the use cases in the project phase. Certification of completion is based on completing quizzes related to the modules.
  • Project certification (validated by H2O.ai) is based on projects completed in groups
  • Note that the syllabus is subject to change

Project certification by h2o.ai

Twelve types of Artificial Intelligence (AI) problems

Background – How many cats does it take to identify a Cat?

In this article, I cover the 12 types of AI problems, i.e. I address the question: in which scenarios should you use Artificial Intelligence (AI)? We cover this space in the Enterprise AI course.

Some background:

Recently, I conducted a strategy workshop for a group of senior executives running a large multinational. In the workshop, one person asked the question: how many cats does it take to identify a Cat?

This question is in reference to Andrew Ng’s famous paper on Deep Learning, in which his team was able to correctly identify images of cats from YouTube videos. On one level, the answer is very clear, because Andrew Ng lists that number in his paper: 10 million images. But the answer is incomplete, because the question itself is limiting; there are many more details in the implementation – for example, training on a cluster with 1,000 machines (16,000 cores) for three days. I wanted to present a more detailed response to the question. Also, many problems can be solved using traditional Machine Learning algorithms – as per an excellent post from Brandon Rohrer, "Which algorithm family can answer my question?". So, in this post I discuss problems that can be uniquely addressed through AI. This is not an exact taxonomy, but I believe it is comprehensive. I have intentionally emphasized Enterprise AI problems, because I believe AI will affect many mainstream applications – although a lot of media attention goes to the more esoteric applications.

 

What problem does Deep Learning address?

What is Deep Learning?

Firstly, let us explore what Deep Learning is.

 

Deep learning refers to artificial neural networks that are composed of many layers. The ‘Deep’ refers to multiple layers. In contrast, many other machine learning algorithms like SVM are shallow because they do not have a Deep architecture through multiple layers. The Deep architecture allows subsequent computations to build upon previous ones. We currently have deep learning networks with 10+ and even 100+ layers.

 

The presence of multiple layers allows the network to learn more abstract features: the higher layers of the network learn more abstract features, building on the inputs from the lower layers. A Deep Learning network can be seen as a feature extraction layer with a classification layer on top. The power of deep learning is not in its classification skills, but rather in its feature extraction skills. Feature extraction is automatic (without human intervention) and multi-layered.
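To make the "feature extraction layers plus a classification layer" idea concrete, here is a minimal, hypothetical Keras sketch of a small convolutional network for 28x28 greyscale images. The convolutional and pooling layers act as automatic feature extractors, and the final Dense layer is the classifier.

```python
# Minimal sketch: feature-extraction layers (Conv/Pooling) with a classification layer (Dense) on top.
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense

model = Sequential()
# Feature extraction: lower layers learn edges, higher layers learn more abstract shapes
model.add(Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)))
model.add(MaxPooling2D((2, 2)))
model.add(Conv2D(64, (3, 3), activation='relu'))
model.add(MaxPooling2D((2, 2)))
# Classification on top of the extracted features
model.add(Flatten())
model.add(Dense(10, activation='softmax'))

model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
model.summary()
```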

 

The network is trained by exposing it to a large number of labelled examples. Errors are detected, and the weights of the connections between the neurons are adjusted to improve results. The optimisation process is repeated to create a tuned network. Once deployed, unlabelled images can be assessed by the tuned network.
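Continuing the sketch above, a minimal training and scoring step (using the MNIST digits bundled with Keras purely as an example of labelled data) looks roughly like this:

```python
# Continuing the sketch above: train on labelled examples, then score previously unseen images.
from keras.datasets import mnist
from keras.utils import to_categorical

(x_train, y_train), (x_test, _) = mnist.load_data()
x_train = x_train.reshape(-1, 28, 28, 1).astype('float32') / 255.0
x_test = x_test.reshape(-1, 28, 28, 1).astype('float32') / 255.0
y_train = to_categorical(y_train, 10)  # one-hot labels

# Errors on the labelled examples drive the weight updates (backpropagation)
model.fit(x_train, y_train, epochs=1, batch_size=128, validation_split=0.1)

# Once tuned, the network can assess new, unlabelled images
predicted_classes = model.predict(x_test).argmax(axis=1)
print(predicted_classes[:10])
```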

 

Feature engineering involves finding connections between variables and packaging them into a new single variable. Deep Learning performs automated feature engineering. Automated feature engineering is the defining characteristic of Deep Learning, especially for unstructured data such as images. This matters because the alternative is engineering features by hand, which is slow, cumbersome and depends on the domain knowledge of the person performing the engineering.
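For contrast, here is a small, hypothetical example of hand-crafted feature engineering on tabular data: a person decides to package two variables into a new single variable. With Deep Learning on unstructured data, such combinations are learned automatically by the layers instead.

```python
# Hand-crafted feature engineering: a human packages two variables into a new single variable.
# Column names here are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "income": [30000, 52000, 61000, 24000],
    "loan_amount": [10000, 25000, 5000, 15000],
})

# The engineered feature encodes domain knowledge chosen by a person,
# e.g. affordability expressed as loan amount relative to income
df["loan_to_income"] = df["loan_amount"] / df["income"]
print(df)
```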

 

Deep Learning suits problems where the target function is complex and the datasets are large, with examples of both positive and negative cases. Deep Learning also suits problems that involve hierarchy and abstraction.

Abstraction is a conceptual process by which general rules and concepts are derived from the usage and classification of specific examples. We can think of an abstraction as the creation of a ‘super-category’ which comprises the common features that describe the examples for a specific purpose but ignores the ‘local changes’ in each example. For example, the abstraction of a ‘Cat’ would comprise fur, whiskers, etc. In Deep Learning, each layer is involved with the detection of one characteristic, and subsequent layers build upon previous ones. Hence, Deep Learning is used in situations where the problem domain comprises abstract and hierarchical concepts. Image recognition falls in this category. In contrast, a spam detection problem that can be modelled neatly as a spreadsheet is probably not complex enough to warrant Deep Learning.

A more detailed explanation of this question can be found in THIS Quora thread.

AI vs. Deep Learning vs. Machine Learning

Before we explore types of AI applications, we need to also discuss the differences between the three terms AI vs. Deep Learning vs. Machine Learning.

The term Artificial Intelligence (AI) implies a machine that can Reason. A more complete list of AI characteristics (source: David Kelnar) is:

  1. Reasoning: the ability to solve problems through logical deduction
  2. Knowledge: the ability to represent knowledge about the world (the understanding that there are certain entities, events and situations in the world; those elements have properties; and those elements can be categorised.)
  3. Planning: the ability to set and achieve goals (there is a specific future state of the world that is desirable, and sequences of actions can be undertaken that will effect progress towards it)
  4. Communication: the ability to understand written and spoken language.
  5. Perception: the ability to deduce things about the world from visual images, sounds and other sensory inputs.

 

The holy grail of AI is artificial general intelligence (think Terminator!) that allows machines to function independently in a normal human environment. What we see today is mostly narrow AI (e.g. the NEST thermostat). AI is evolving rapidly. A range of technologies drive AI currently. These include: image recognition and auto labelling, facial recognition, text to speech, speech to text, auto translation, sentiment analysis, and emotion analytics in image, video, text, and speech (source: Bill Vorhies). AI apps have also reached accuracies of 99%, in contrast to 95% just a few years back.

 

Improvements in Deep Learning algorithms drive AI. Deep Learning algorithms can detect patterns without the prior definition of features or characteristics. They can be seen as a hybrid form of supervised learning, because you must still train the network with a large number of examples, but without the requirement to predefine the characteristics of the examples (features). Deep Learning networks have made vast improvements both due to the algorithms themselves and due to better hardware (specifically GPUs).

 

Finally, in a broad sense, the term Machine Learning means the application of any algorithm that can be applied against a dataset to find a pattern in the data. This includes algorithms for supervised and unsupervised learning, segmentation, classification and regression. Despite their popularity, there are many reasons why Deep Learning algorithms will not make other Machine Learning algorithms obsolete.

12 types of AI problems

With this background, we now discuss the twelve types of AI problems.

1) Domain expert: Problems which involve Reasoning based on a complex body of knowledge

This includes tasks which are based on learning a body of knowledge (legal, financial, etc.) and then formulating a process where the machine can simulate an expert in the field.

2) Domain extension: Problems which involve extending a complex body of Knowledge

Here, the machine learns a complex body of knowledge like information about existing medication etc. and then can suggest new insights to the domain itself – for example new drugs to cure diseases.

3) Complex Planner: Tasks which involve Planning

Many logistics and scheduling tasks can be done by current (non-AI) algorithms. But increasingly, as the optimization becomes complex, AI could help. One example is the use of AI techniques in IoT for sparse datasets. AI techniques help in this case because we have large and complex datasets where human beings cannot detect patterns but a machine can do so easily.

4) Better communicator: Tasks which involve improving existing communication

AI and Deep Learning benefit many communication modes, such as automatic translation, intelligent agents, etc.

5) New Perception: Tasks which involve Perception

AI and Deep Learning enable newer forms of perception, which in turn enable new services such as autonomous vehicles.

6) Enterprise AI: AI meets Re-engineering the corporation!

While autonomous vehicles etc get a lot of media attention, AI will be deployed in almost all sectors of the economy. In each case, the same principles apply i.e. AI will be used to create new insights from automatic feature detection via Deep Learning – which in turn help to optimize, improve or change a business process (over and above what can be done with traditional machine learning). I outlined some of these processes in financial services in a previous blog: Enterprise AI insights from the AI Europe event in London. In a wider sense, you could view this as Re-engineering the Corporation meets AI/ Artificial Intelligence. This is very much part of the Enterprise AI course

 

7) Enterprise AI: adding unstructured data and Cognitive capabilities to ERP and Data Warehousing

For the reasons listed above, unstructured data offers a huge opportunity for Deep Learning and hence AI. As per Bernard Marr writing in Forbes: “The vast majority of the data available to most organizations is unstructured – call logs, emails, transcripts, video and audio data which, while full of valuable insights, can’t easily be universally formatted into rows and columns to make quantitative analysis straightforward. With advances in fields such as image recognition, sentiment analysis and natural language processing, this information is starting to give up its secrets, and mining it will become increasingly big business in 2017.” I very much agree with this. In practice, this will mean enhancing the features of ERP and Data Warehousing systems through Cognitive systems.

8) Problems which impact domains due to second order consequences of AI

David Kelnar says in The fourth industrial revolution a primer on artificial intelligenc…

“The second-order consequences of machine learning will exceed its immediate impact. Deep learning has improved computer vision, for example, to the point that autonomous vehicles (cars and trucks) are viable. But what will be their impact? Today, 90% of people and 80% of freight are transported via road in the UK. Autonomous vehicles alone will impact: safety (90% of accidents are caused by driver inattention); employment (2.2 million people work in the UK haulage and logistics industry, receiving an estimated £57B in annual salaries); insurance (Autonomous Research anticipates a 63% fall in UK car insurance premiums over time); sector economics (consumers are likely to use on-demand transportation services in place of car ownership); vehicle throughput; urban planning; regulation and more.”

 

9) Problems in the near future that could benefit from improved algorithms

A catch-all category for things which were not possible in the past but could be possible in the near future due to better algorithms or better hardware. For example, in speech recognition, improvements continue to be made, and currently the abilities of the machine equal those of a human. From 2012, Google used LSTMs to power the speech recognition system in Android. Just six weeks ago, Microsoft engineers reported that their system reached a word error rate of 5.9% – a figure roughly equal to that of human abilities for the first time in history. The goalposts continue to move rapidly; for example, loom.ai is building an avatar that can capture your personality.

10) Evolution of Expert systems

Expert systems have been around for a long time.  Much of the vision of Expert systems could be implemented in AI/Deep Learning algorithms in the near future. If you study the architecture of IBM Watson, you can see that the Watson strategy leads to an Expert system vision. Of course, the same ideas can be implemented independently of Watson today.

 

11) Super Long sequence pattern recognition

This domain is of personal interest to me due to my background with IoT (see my course at Oxford University, Data Science for Internet of Things). I got this title from a slide by Uber’s head of Deep Learning, whom I met at the AI Europe event in London. The application of AI techniques to sequential pattern recognition is still an early-stage domain (and does not yet get the kind of attention that, for example, CNNs do) – but in my view, this will be a rapidly expanding space. For some background, see this thesis from Technische Universitat Munchen (TUM), Deep Learning For Sequential P…, and also this blog by Jakob Aungiers, LSTM Neural Network for Time Series Prediction.
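As a flavour of what LSTM-based sequence modelling looks like in practice, here is a minimal, hypothetical Keras sketch that learns to predict the next value of a sine wave from a window of previous values. This is my own toy illustration, not code from the thesis or blog referenced above.

```python
# Minimal sketch: an LSTM that predicts the next value of a sine wave from the previous 20 values.
import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense

series = np.sin(np.linspace(0, 50, 1000))
window = 20

# Build (samples, timesteps, features) windows and next-step targets
X = np.array([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]
X = X.reshape((-1, window, 1))

model = Sequential()
model.add(LSTM(32, input_shape=(window, 1)))
model.add(Dense(1))
model.compile(optimizer='adam', loss='mse')
model.fit(X, y, epochs=5, batch_size=32, verbose=0)

print(model.predict(X[:3]))  # predicted next values for the first three windows
```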

 

12) Extending Sentiment Analysis using AI

The interplay between AI and Sentiment analysis is also a new area. There are already many synergies between AI and Sentiment analysis because many functions of AI apps need sentiment analysis features.

“The common interest areas where Artificial Intelligence (AI) meets sentiment analysis can be viewed from four aspects of the problem and the aspects can be grouped as Object identification, Feature extraction, Orientation classification and Integration. The existing reported solutions or available systems are still far from being perfect or fail to meet the satisfaction level of the end users. The main issue may be that there are many conceptual rules that govern sentiment and there are even more clues (possibly unlimited) that can convey these concepts from realization to verbalization of a human being.” source: SAAIP
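As a simple illustration of neural sentiment analysis (a toy sketch, not the SAAIP system quoted above), the IMDB review dataset bundled with Keras can be used to train a small sentiment classifier with an embedding layer and an LSTM:

```python
# Minimal sketch: sentiment classification of IMDB reviews with an Embedding + LSTM in Keras.
from keras.datasets import imdb
from keras.preprocessing.sequence import pad_sequences
from keras.models import Sequential
from keras.layers import Embedding, LSTM, Dense

max_words, max_len = 10000, 200
(x_train, y_train), (x_test, y_test) = imdb.load_data(num_words=max_words)
x_train = pad_sequences(x_train, maxlen=max_len)  # pad/truncate reviews to a fixed length
x_test = pad_sequences(x_test, maxlen=max_len)

model = Sequential()
model.add(Embedding(max_words, 32, input_length=max_len))
model.add(LSTM(32))
model.add(Dense(1, activation='sigmoid'))  # positive vs negative sentiment

model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
model.fit(x_train, y_train, epochs=1, batch_size=128, validation_data=(x_test, y_test))
```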

Notes: the post The fourth industrial revolution a primer on artificial intelligenc… also offers good insight on AI domains; also see #AI application areas – a paper review of AI applications (pdf).

 

Conclusion

To conclude, AI is a rapidly evolving space. Although AI is more than Deep Learning, advances in Deep Learning drive AI. Automatic feature learning is the key feature of AI. AI needs many detailed and pragmatic strategies which I have not covered here. A good AI designer should be able to suggest more complex strategies like pre-training or transfer learning.

AI is not a panacea. AI comes with a cost (skills, development, and architecture) but provides an exponential increase in performance. Hence, AI is ultimately a rich company’s game. But AI is also a ‘winner takes all’ game and hence provides a competitive advantage. The winners in AI will take an exponential view, addressing very large-scale problems, i.e. what is possible with AI that is not possible now?

We cover this space in the  Enterprise AI course 


Meet me at AI-europe in London (Uber, Nvidia, Kayak, UBS, Bell Labs + others speaking)

I am at AI-europe next week. It should be a great event, where Uber, Nvidia, Kayak, UBS, Bell Labs and others are speaking.

I am very much looking forward to the Nvidia talk (I work with Nvidia for the course which I teach at Oxford University, Data Science for Internet of Things).

I am also looking forward to the following talks. I believe the event is almost full and there are only a few places left. See more at AI-europe.

  • Opening Speech: De-mystifying AI – Terry Jones, Founding Chairman, KAYAK
  • AI as a game-changer for every industry: disruption now and perspectives for 2025 – Robin Bordoli, Chief Executive Officer, CrowdFlower
  • Deploying Deep Learning everywhere: cutting-edge research teams, hyper-scale data centers, enterprises using AI – Serge Palaric, Vice President EMEA Embedded & OEMs, NVIDIA
  • Contact centers: how Artificial Intelligence is revolutionizing the customer experience – Dr Nicola J. Millard, Head of Customer Insight & Futures, BT Global Services
  • Banking: why UBS is interested in AI and other fintech innovations – Annika Schröder, Director, UBS Group Innovation, UBS AG
  • Health: the value of integrating deep learning – use case: applying deep learning in devices to diagnose cancer – Carlos Jaime, Head of Health & Medical Equipment Division, Samsung Electronics France
  • Bringing machine learning to every corner of your business – Luming Wang, Head of Deep Learning, Uber
  • Augmented Reality – Danny Lopez, COO, Blippar
  • Virtual assistants: their impacts on the Internet and society – why AI-based digital assistants will contribute to revolutionizing the Internet and place technology at the service of humans – Julien Hobeika, Juliedesk
  • Image analysis: research and its applications in the real world – Miriam Redi, Bell Labs