Hierarchies vs networks – Why networks matter and hierarchies will crumble – The omega point of OpenGardens

Hierarchies and networks

I have been saying this in many forms over the years .. but it is still worth re-emphasizing since it is the philosophical foundation of OpenGardens

What exactly is a network? And why are networks so special?

Networks are all around us. But their effects are less well understood because, in our daily lives, we are used to hierarchies; for example, in the organization of offices and institutions, we still see hierarchical structures. Hierarchies are the opposite of networks. While hierarchies will not be replaced by networks in all cases, already, through the Internet, we are seeing networks assert their strength in many aspects of life.

Networks have a subtle but disruptive impact. Global warming is a good example of a network level change. Here, by ‘network level change’, we mean that events leading to global warming are interconnected, but their impact is felt only over a long period of time and is felt separately from the change that triggers it.

For instance, you cannot know exactly how much the ozone layer will change for every plastic bag that you fail to recycle, but most people would agree that the environment is impacted for every such plastic bag that ends up on the ocean floor.

On first impression, networks are not special in any way.

A network is simply a collection of links between units (also called nodes).
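To make this concrete, a network can be sketched in a few lines of code. This is purely an illustrative sketch – the node names and the dictionary-of-sets representation are my own choices, not anything from the text.

```python
# Illustrative sketch: a network as units (nodes) and links between them,
# stored as a dictionary mapping each node to the set of nodes it links to.
# The node names here are hypothetical.
network = {
    "A": {"B", "C"},  # unit A links to units B and C
    "B": {"A"},
    "C": {"A"},
}

def add_link(net, u, v):
    """Create an (undirected) link between two units."""
    net.setdefault(u, set()).add(v)
    net.setdefault(v, set()).add(u)

add_link(network, "B", "C")  # B and C are now directly connected
```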

Networks exist at multiple levels: global, societal (country), group (office), and individual.

Every unit within a network can be seen as a closed system.

Closed systems interact in predetermined ways.

When a network connects more than two closed systems, their interaction is no longer predetermined.

This could be seen as ‘opening up’ the system.

The system has now gone from a closed system to an open system.

Open systems interact in unknown, radical ways.

All closed systems have a natural propensity to find new connections which cause them to ‘open up’.

What happens when networks open up and how do networks evolve?

But what happens when systems open up? That is, how do networks evolve? This is a complex question.

You can study the propensity of a system to change in two ways: as a biological system or as a mathematical system.

From a biological perspective, a system evolves to survive and to grow. First there is an initial interaction. From that interaction comes variation―the system changes and adapts. Over time, there is selection and retention―the best qualities are adopted and retained.

This approach is basically along the lines of Darwin’s natural selection theories.

From a mathematical perspective, networks evolve by creating order out of chaos.

How does order appear in a network? Without going into the mathematics, all parts of the system appear to communicate with all other parts purely by local interactions.
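The flavour of this can be shown with a toy model. The one-dimensional majority-rule automaton below is my own illustrative example (not from the text): each cell interacts only with its two immediate neighbours, yet a disordered row quickly settles into ordered blocks with no global coordinator.

```python
def step(cells):
    """One round of purely local interaction on a ring of 0/1 cells:
    each cell adopts the majority value of itself and its two neighbours."""
    n = len(cells)
    return [
        1 if cells[(i - 1) % n] + cells[i] + cells[(i + 1) % n] >= 2 else 0
        for i in range(n)
    ]

# A noisy row smooths out after a couple of local-only steps:
row = [1, 0, 1, 0, 1, 1]
row = step(row)  # -> [1, 1, 0, 1, 1, 1]
row = step(row)  # -> [1, 1, 1, 1, 1, 1]: global order from local rules
```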

In general, a system comprises a set of interacting or interdependent entities that form an integrated whole.

When we speak of a system, we also define a boundary―the external context within which the system exists.

Entities within a system interact with one another (within the boundaries of the system) but can also interact with entities from outside the system’s boundaries.

An open system continuously interacts with its environment.

In doing so, it evolves and grows based on external input.

In contrast, a closed system does not get external feedback and does not evolve.

The breakdown of hierarchies is related to networks and open systems: open systems lead to connections, and more connections lead to more social interactions and to a “step change” in the body of knowledge.

This step change is exponential and disruptive – powered by the Internet.

That’s why freedom of the Internet matters.

Clustering: More than connecting friends – An amplification of ideas

When left to themselves, networks have a tendency to “cluster” because two elements connected to a common third element are more likely to establish links among themselves, leading to clusters.

This leads to phenomena like six degrees of separation. “Six degrees of separation…refers to the idea that everyone is [at most] six steps away from any other person on Earth, so that a chain of ‘a friend of a friend’ statements can be made…to connect any two people in six steps or fewer.”
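As an illustrative sketch (the people and friendships below are made up), a breadth-first search can count these ‘degrees of separation’ in a small network:

```python
from collections import deque

# Hypothetical friendship network: each person maps to their direct friends.
friends = {
    "ann": {"bob", "cat"},
    "bob": {"ann", "dan"},
    "cat": {"ann", "dan"},
    "dan": {"bob", "cat", "eve"},
    "eve": {"dan"},
}

def degrees_of_separation(net, src, dst):
    """Length of the shortest 'friend of a friend' chain, via breadth-first search."""
    seen, queue = {src}, deque([(src, 0)])
    while queue:
        person, dist = queue.popleft()
        if person == dst:
            return dist
        for nxt in net[person] - seen:
            seen.add(nxt)
            queue.append((nxt, dist + 1))
    return None  # no chain connects the two people

# ann -> bob (or cat) -> dan -> eve: three steps.
```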

Thus, networks can potentially connect friends, and these human factors offer a bigger reason for the success of social networks.

But networks do more than ‘connect friends’: they propagate and amplify ideas.

Places that lie at the crossroads are a hub of new ideas simply because they ‘connect people’.

Consider the case of the ancient mummies found in the Tarim Basin.

The Tarim Basin  is located in the far western region of China.

Surrounded by inhospitable mountains and deserts, the Tarim basin is a vast, arid micro-continent and may have been one of the last places in Asia to be inhabited because its aridity required that technology for water transport and storage be developed before people could live there.

However, ancient DNA from mummies found there suggests that a culturally rich and interrelated population of Western Eurasian and Asian people had lived here since the early Bronze Age.

If this region was so arid and inhospitable, why did people choose to live there, intermingle and thrive in such a hostile environment?

Despite its bleakness, the proximity of the Tarim basin to the ancient Silk Road was the main reason for its cultural development.

Thus, living at a crossroads is good for the creation of new ideas, no matter how hostile the surroundings.

With networks, we no longer need geographical hubs – we have social hubs, and these social hubs are far more fluid, dynamic, global and difficult to control.

So, why are networks important?

Simply because networks lead to Open systems.

Open Systems lead to a breakdown of hierarchy and this impacts society broadly.

In a world in which hierarchies break down, we see a phase of creative destruction which manifests itself in the liberalization of society.

Tarim basin – Image source – Wikipedia

The liberalization of society – The cultural impact of networks and social networks

The rise of networks and the liberalization of society go together.

As networks proliferate, society becomes more liberal.

Because ideas and networks know no boundaries and they have a tendency to open up closed ecosystems, their effects are global.

The flow of information and connections breaks down hierarchies and questions the blind following of authority.

As connections are formed globally and contradictory views are shared and discussed, we will question many forms of authority and structure in society that we have taken for granted in the past – for instance governance, religion, identity (the groups with which we affiliate ourselves, and the creation of a global identity) and spirituality.

Thus, networks have a disruptive effect.

They topple existing frameworks, most of which are based on existing hierarchies.

Existing frameworks and hierarchies are often the result of an older power struggle that has played out, the results of which are now maintained, often through force.

Networks disrupt that status quo.

In this sense, networks can be good for humanity and we will see networks bring about even more creative destruction in future.

The relationship is symbiotic. The more we use networks and grow, the more the network is enriched.

Dictators and guns will go the way of the Dodo ..

Why this matters – The creation of a global Identity

I would very much urge you to listen to this brief, poignant recording from ‘Sara’ in Libya.

Most people in free societies can relate to this young woman … and that’s why the liberalization of societies and the evolution of networks matter at a human level – ‘We’re not living like humans’

I would even go so far as to say that the empathy and support at individual levels through social networks is far more significant than that from governments, and that over time, as we relate to people like Sara and networks connect us, a new global identity will emerge ..

And that’s the Omega point of OpenGardens

My Oxford University course – Big Data for Telecoms

My Oxford University Course – Big Data for Telecoms – is now announced. Have a look. It’s a key topic going forward and I will include results from a survey I am conducting covering Use cases, Analytics and Vendor Strategies for Big Data and Telecoms

You will get a certificate of completion from Oxford University. My objective is to distinguish the unique elements of Telecoms for Big Data and, ultimately, to define the unique aspects of the role of a Data Scientist for the Telecoms industry

Big Data for Telecoms is a one-day course which provides a strategic foundation for the application of Big Data principles to the Telecoms ecosystem.

The key elements covered in this course are:

  1. Uniquely tailored for the Mobile / Telecoms industry including Internet of Things
  2. Based on an ongoing survey (Case studies, Analytics and Vendor strategies) for Big Data and Telecoms
  3. An understanding of Big Data analytics especially geared to Mobile
  4. Understanding the role of the Mobile Data Scientist
  5. Policy and regulatory issues and the challenges and opportunities they provide
For a more detailed outline, see Oxford University Course – Big Data for Telecoms

NFC Ring – Liberating NFC from the Operators and Apple – the kickstarter way ..

Update - I have updated this blog to also include Apple

I just backed this project on kickstarter – NFC Ring created by @John McLear in the UK

At feynlabs (my edtech startup), we completed a funding round successfully on Kickstarter – Computer Science for your Child – and we support cool innovation from the grassroots community at Kickstarter

But wearing a Mobile hat, this project (NFC Ring) is a great idea

NFC (Near Field Communication) has been too closely tied by the operators to SIM ..

Nothing wrong with that per se, BUT

NFC has far greater applications than payments alone

And it’s not working .. NFC payment volume forecast revised downward by more than 40pc: Gartner

Meanwhile, Apple plays its own game with zero interest in NFC – NFC Stands For Nobody F****** Cares

Nor will VCs support any startups in this space for NFC, and payment solutions like Square are catching up fast

In other words, NFC is an open technology which works at the moment, but existing players (Apple and the Operators) are trying to create an ecosystem which they control

Kickstarter projects like NFC Ring liberate NFC and bring it back to a simpler (but more ubiquitous) set of applications. 

The system is open sourced as well .. hence a platform for others to create innovative things

Have a look .. It’s a simple and cool idea ..

Welcoming Irina Gray as deputy editor of OpenGardens blog ..

Over the years (starting 2005!), this blog has evolved along with my personal work

Earlier this year, I launched my edtech startup feynlabs (still in alpha for the rest of this year – as we develop content based on trials)

Also, as you can see from the posts here (on Big Data for Telecoms), futuretext (i.e. my existing work) will be increasingly focused on Big Data for Telecoms.

In that context, we will blog far more extensively about Big Data for Telecoms here, and my Oxford University course will also be aligned to the theme of Big Data for Telecoms.

So, to manage the extra focus on Big Data for Telecoms on the Open Gardens blog, I am pleased to welcome Irina Gray as the deputy editor of the Open Gardens blog.

Based in Scotland but Russian by ethnic origin, Irina was born and grew up in Turkmenistan – which used to be part of the Soviet Union and is now an independent country.

In 1996 she graduated from a university back in Turkmenistan with a bachelor’s degree in English and literature.

Later on she went on to work in the investment industry and got her master’s degree in international investment and financial studies from Heriot-Watt University in Edinburgh, Scotland, in 2000.

Since 2003, she has been working with me, in various capacities, in my previous ventures.

Irina loves networking with people around the world, which also opens up some incredible opportunities for personal and business development.

Irina’s other interests include the environment, raw foods, nutritional therapy and natural health.

Irina currently lives in Scotland with her husband and son.

Evolving the definition of Computational thinking

Please follow @feynlabs to know more about our work at Feynlabs

Abstract

In this article, I explore how the definition of Computational Thinking could evolve.  As Computer Science is introduced in schools, it is tempting to confine it to the ‘curriculum’. However, doing so would miss the larger point of how Computer Science could truly benefit innovation and economies.  The article intentionally takes an aspirational view of Computer Science and Computational thinking.

Background – Computer Science and Computational thinking

Computer Science is based on the idea of Computational Thinking. Computational thinking, i.e. ‘thinking like a computer scientist’, involves breaking down a problem logically and algorithmically to provide an optimum solution

The use of the word ‘Computer’ in Computer Science gives the mistaken perception that Computer Science is the study of Computers. However, Computer science can be better understood as a process of problem solving using Computers. Thus, while Computers play an important part in understanding Computer Science, the key skill of a Computer Scientist lies in the ability to solve problems using Computers

This definition (Computer Science = Based on Computational Thinking = The process of problem solving using Computers) has some implications.

1)      Computer Science is like mathematics. By that, we mean that the problems addressed by Computer Science are often in other scientific and technical domains (e.g. weather forecasting).

2)      Also, Computer Science is related to tool making (creating the computing tools and platforms – as opposed to merely using them)

This leads to yet other questions:

1)      What type of problems can be solved using Computers?

2)      What do we mean by ‘solving’ the problem?

In a nutshell, Computer science is concerned with solving problems that can be defined using an algorithm

In the simplest sense, an algorithm is a step by step set of instructions to solve any instance of a problem. By implication, this means that not all problems can be solved by a Computer.  A problem that can be solved by a Computer should be ‘Computable’ i.e. an algorithm must exist for solving it.
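As a concrete and standard example of such a step-by-step recipe, Euclid's algorithm computes the greatest common divisor of any two non-negative integers in finitely many steps – which is exactly what makes that problem ‘Computable’:

```python
def gcd(a, b):
    """Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b)
    until b is 0; the remaining a is the greatest common divisor."""
    while b != 0:
        a, b = b, a % b
    return a
```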

Based on the early seminal work done by Jeannette Wing in 2006/2007, the term computational thinking is used to describe people and computers working together to solve problems and accomplish tasks.

The processes can be executed either by a human or by a machine. Computational thinking (when seen as humans and computers working to solve complex problems) allows us to solve much more complex problems – for example, genome sequencing. See many more examples in the long version of Jeannette Wing’s presentation. Jeannette Wing believes that Computational thinking is a fundamental skill for everybody, not just for Computer scientists. We should add Computational Thinking to reading, writing, and arithmetic to improve every child’s analytical ability.

Evolution of Computational Thinking

It could be tempting to reduce computational thinking to just another subject to be taught in schools. However, if we take a more aspirational viewpoint (again the examples here are very interesting long version of Jeanette Wing’s presentation)  – then the interplay between humans and computers will change the behaviour of both.

A more sweeping definition of Computational Thinking would call for new skills and new ways of thinking, and make a radical change to the economies that adopt these principles.

The intriguing question is:

Over the years, how will the thinking of the learners (students) evolve? 

and

Could we end up addressing even more complex problems? i.e. Could Computer Science cause a virtuous cycle towards greater innovation if Computational thinking allows us to address increasingly complex problems over the generations?

A change in our thinking is already happening – for instance studies show that the Internet is changing the way in which humans use memory

So, as humans and Computers collaborate through Computational thinking to solve increasingly complex problems – some tasks will be done better by humans, others better by Computers.

The trick will be in identifying the two domains and combining them to solve complex future problems.

Here are three ways I believe Computational Thinking will evolve

a)      The Computer as a prosthetic/ tool building ability of Computational thinking

b)      Collective intelligence – Network based intelligence

c)      Network based collaboration

 1)     Tool building and Prosthetics

The first scene from 2001: A Space Odyssey shows a group of early hominids who learn tool building and thereby master their environment. If we extend this idea, we can say that humans have always been good tool builders – and now, using Computational thinking, we are extending this idea to create a whole new class of tools

In this model, we could think of the Computer as an auxiliary brain – as a type of prosthesis. All tools/appliances augment the human body, e.g. a microscope to see better.

Computational thinking can be thought of as using Computers to augment the human brain. This concept has been proposed by a number of thinkers since the 1980s

2)     Collective intelligence – network based intelligence

The idea of Collective intelligence has been proposed by thinkers like Pierre Levy and others.

Collective intelligence or Collective IQ is shared or group intelligence that emerges from the collaboration and competition of many individuals and appears in consensus decision making. The term appears in sociobiology, political science and in context of mass peer review and crowdsourcing applications. It may involve consensus, social capital and formalisms such as voting systems, social media and other means of quantifying mass activity.  (Wikipedia)
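One of the simplest formalisms mentioned above – a voting system – can be sketched in a few lines. This is an illustrative toy, not any particular platform's mechanism:

```python
from collections import Counter

def collective_decision(votes):
    """Aggregate many individual judgements into one group answer
    by simple majority (plurality) vote."""
    winner, _count = Counter(votes).most_common(1)[0]
    return winner

# Five individuals, one collective answer:
votes = ["yes", "no", "yes", "yes", "no"]
decision = collective_decision(votes)  # -> "yes"
```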

A more intellectually radical view sees Collective Intelligence as an antidote to capitalism – a return to a simpler means of working together, independent of the desire to make money

Lévy’s visionary anthropology is therefore diametrically opposed to that of the Californian ideologues. Instead of forming a perfect market, the Net opens the space of Knowledge. Crucially, this new space is completely distinct from the space of the Commodity. When we are on-line, we want to learn, play and communicate with one another rather than to make money. Above all, we want to participate within the “collective intelligence” because we suffer from individual alienation caused by capitalism.

3) Network based collaboration

Finally, network based collaboration i.e. the idea of ‘Net Smart’ as postulated by Howard Rheingold.

I reviewed Howard’s book previously – Net Smart – a book review

The book is about – How to use social media intelligently, humanely and mindfully.

Howard sees the ability to engage with cyberculture as a core skill – much like driving a car for the current generation – and he proposes that it is not an automatic skill

While we all engage with social media in one way or the other, it is a skill that can be improved.

Further, he sees a time lag between the technology and the social revolution (ex: there was a time lag between print and the social impact due to widespread availability of books).

In that sense, we are living ‘in the time lag’ and the changes that are happening around us will be apparent only in retrospect.

Howard believes that the skill of digital literacy can make a difference between being empowered or manipulated – being serene or being frantic. Furthermore, he sees competency in engaging with cyberspace as a two-fold skill – i.e. the technical competency of using the tools and also the social interaction of engaging with others.

There is a very insightful statement from Howard early in the book which says that if he were to reduce the essence of homo sapiens to one sentence, it would be “People create new ways to communicate, then use their new media to make complicated things together”

It is this ability of ‘creating new ways to communicate and then using that new media to make complicated things together’ – which could be a new way of Computational thinking

Conclusions

Shuchi Grover says in Learning to Code is not enough – Science is changing in a subtle but fundamental way: from the use of computing to support scientific work, to integrating Computer Science (CS) concepts and tools into the very fabric of science.

So, in this blog, we explore a deeper meaning – not how Computers will evolve, but how human thinking and behaviour could evolve – which is an extension of Computational thinking (the process of problem solving using Computers)

Acknowledgements – many thanks to Eelco Dijkstra, who I met at the Vrije Universiteit in Amsterdam, for some of the inspiration behind this post and also for recommending Herbert Simon’s book The Sciences of the Artificial

Please follow @feynlabs to know more about our work at Feynlabs

Big Data and Telecoms – Eleven reasons why Big Data for Telecoms is different..

I have been working on these ideas over the past few posts

In this blog, I present eleven reasons why the idea of Big Data for Telecoms is different from conventional Big Data

These ideas are part of my forthcoming Oxford University course on Big Data and Telecoms and also my next book Mobile Data Scientist

Today, there is genuine change in the Telecoms ecosystem.

To paraphrase Dylan – the times they are a-changin’ – because, for the first time, Telecom Operators on both sides of the Atlantic are less constrained by the regulations that hampered them (as opposed to Web companies). See What happens when Telecoms Operators can profit from the Data they hold

Also, so far, the idea of Big Data has been driven by the Web ..

So, the proposition is: When it comes to Telecoms/Mobile – there are additional considerations.

Here’s why ..

1)      The Harvard business review says that the role of the Data scientist will be one of the hottest roles going forward. How will the role of Data scientist differ for the Telecoms / Mobile ecosystem?

2)      Increasingly, Telecoms / Mobile will need the ability to handle real-time data, and many ideas taken from the web will not apply to Telecoms (e.g. see How do IOT appstores differ from conventional appstores)

3)      IOT and Big Data is a specific subset of Telecoms, and it will need some unique considerations (e.g. Why interoperability is critical to making the Internet of Things work)

4)      The taxonomy of Telecoms data is unique. In other words, the future Telecoms network would be able to see many facets of their customer through data categories which are only now beginning to manifest. More on this in a subsequent post

5)      Further to the above point, augmenting telecoms data with social data will be a key element of Big Data going forward

6)      The assisted web will be different for Telecoms – the best example of this is ‘Google now’ for Telecoms but with physical data overlaid

7)      Big data analytics for Telecoms – will be based on classic Big Data algorithms such as predictive algorithms, machine learning algorithms etc but specific considerations for these algorithms will apply based on datasets and Telco domain knowledge

8)      Secondary uses of data sets and data clusters – many datasets will have secondary uses and will hence be potentially monetizable. For example, bus routes could be used to create a system to indicate when the next bus is due (a secondary use of the route dataset)

9)      Other Industries will become data enabled, and that means more datasets could be merged/amalgamated to create new insights. Last week, I saw a presentation from the Research Data Alliance, which implements the technology, practice, and connections that make Data Work across barriers. The Research Data Alliance aims to accelerate and facilitate research data sharing and exchange. Such initiatives will become more common and bring new industries together, providing an opportunity to leverage Telecoms data

10)   Telco APIs .. may finally find an area in which they could be used!

11)   The idea of Nappies and Beer .. could have a more specific impact for Telecoms when Real time and social media data are merged with Telco data

These ideas bring together many elements from the Telco and the Web worlds – which exist in isolation but can provide new value in amalgamation. The timing is right and they are already happening (AllAboard: a system for exploring mobility and optimizing transport in developing countries using cellphone data)

If you would like to be added to our mailing list on Big Data – please email me at ajit.jaokar at futuretext.com

PS – I am writing a paper on Internet of Things and Big Data at the Internet of Things Mashup day in Oxford. If you would like a copy please tweet @webinosproject and #iotmashup

How do IOT appstores differ from conventional appstores ..

How do IOT appstores differ from conventional appstores?

Conventional appstores add commercial features like packaging, discovery, monetization etc but do not extend to the sense-compute-actuate paradigm.

IOT appstores differ from traditional appstores for two reasons

a) Their reliance on multiple sources of sensor based open data and

b) The need of actuating devices in (almost) real time.

Conventional appstores do not need data from a variety of sources. Nor do they need to actuate immediately.

IOT devices may also send data about themselves periodically, on demand or triggered by an event.

The IOT service would thus need to “sense – compute and actuate” in almost real time.

Sense Compute Actuate is a well known idea in Digital control systems but not in conventional appstores
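The loop can be sketched as follows. This is a minimal illustrative sketch – the sensor reading, threshold and actuator names here are hypothetical placeholders, not a real IOT API:

```python
def compute(reading, threshold=25.0):
    """Compute: turn a sensed value into a decision.
    The threshold of 25.0 is an arbitrary illustrative value."""
    return "cooling_on" if reading > threshold else "cooling_off"

def actuate(action):
    """Actuate: stand-in for sending a command to a physical device."""
    return f"actuated: {action}"

def control_step(sense):
    """One sense-compute-actuate cycle; `sense` is any zero-argument
    callable returning a reading (e.g. a temperature in Celsius)."""
    return actuate(compute(sense()))

# Example with a fake sensor reading of 28.3 degrees:
result = control_step(lambda: 28.3)  # -> "actuated: cooling_on"
```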

Image source – Drexel.edu

I will be referring to this in my new book on the Mobile Data Scientist and my Oxford University course on Big Data for Telecoms

Big Data fundamentalism

I like this contra thinking .. about the idea of Big Data fundamentalism 

With so much hype around Big Data .. it’s easy to think that Big Data is the truth

Not so ..

To summarize, the six myths of Big Data identified by Kate Crawford of Microsoft Research are:

Myth 1: Big Data is New
Myth 2: Big Data Is Objective
Myth 3: Big Data Doesn’t Discriminate
Myth 4: Big Data Makes Cities Smart
Myth 5: Big Data Is Anonymous
Myth 6: You Can Opt Out

I will be referring to this in my new book on the Mobile Data Scientist and my Oxford University course on Big Data for Telecoms

Paper on Internet of Things and Big Data + Internet of Things mashup day .. On Tuesday ..

I will be at this day .. which promises to be a fun day at Oxford ..

Based on this event, I am writing a paper on Internet of Things and Big Data as part of my Oxford Uni course on Big Data and Telecoms

If you would like a copy of this paper – please reply on twitter to @webinosproject and #iotmashup

 Internet of things Mash-up Day 23rd of July, University of Oxford webinos Foundation

Tuesday July 23 2013 at 9:30 am – Wednesday July 24, 2013 at 5:00 PM at Oxford United Kingdom

It will be an interesting day to learn about integrating IOT devices – Internet of things Mash-up Day

From Landsgemeinde to Policy 3.0 – Paper review – The Futurium—a Foresight Platform for Evidence-Based and Participatory Policymaking

Introduction

This blog post is a detailed review/notes of a paper called The Futurium—a Foresight Platform for Evidence-Based and Participatory Policymaking by Franco Accordino

(Note this post is part of a related twin post An analysis of Internet Policy trends over the last six months and their implications for technology policy)

A couple of weeks ago, I attended the ESPAS workshop in Brussels (Scientific and Technological Futures and Policy Challenges) at the invitation of MEP James Elles

I read the paper at that workshop. The paper talks of a platform called Futurium, which is the topic of this review.

The paper ends with a reference to the vision of Pierre Teilhard de Chardin of a supreme point (Omega Point) towards which the universe is constantly developing to reach increasingly higher levels of consciousness and unity, and says that “As a consequence, policymaking and decisions will, in the future, be increasingly taken in a collective way.”

I have yet to see a policy paper refer to the ideas of interesting philosophers like Teilhard, and that got my attention! (If you are interested in these things, I would recommend you read the book The Phenomenon of Man)

More to the point, I very much agree with the overall idea of Collaboration, Collective Intelligence (e.g. Collective Intelligence by Pierre Levy) and the need for a new form of decision making.

Why this matters

As I have said in the related blog (An analysis of Internet Policy trends over the last six months and their implications for technology policy) – conversations on the Internet are based on cool/sexy issues often dominated by Tech.

This is only part of the picture.

Looking from a policy perspective, we need a comprehensive and rational way to widen the policy conversation.

Why should we widen the Policy conversation?

Here are two reasons:

1)      Direct democracy – Landsgemeinde – There is a (now diminishing) tradition in some Swiss cantons called Landsgemeinde. The Landsgemeinde or “cantonal assembly” is one of the oldest forms of direct democracy. Eligible citizens of the canton meet on a certain day in the open air to decide on laws and expenditures by the council. Everyone can debate a question. Voting is accomplished by those in favour of a motion raising their hands. Image: Landsgemeinde in Appenzell – image source NBC News

Could we bring the ethos of participatory democracy (aka Landsgemeinde) to the age of the Web and beyond?

2)      The argument for inclusive institutions – Why Nations Fail – There is an interesting book I am reading called “Why Nations Fail”. A book review of Why Nations Fail from The Guardian has some key segments that show why we need inclusive institutions.

Their argument is that the modern level of prosperity rests upon political foundations. Proximately, prosperity is generated by investment and innovation, but these are acts of faith: investors and innovators must have credible reasons to think that, if successful, they will not be plundered by the powerful.

For the polity to provide such reassurance, two conditions have to hold: power has to be centralised and the institutions of power have to be inclusive. Without centralised power, there is disorder, which is anathema to investment.

So, if inclusive institutions are necessary, how do they come about? Again, Acemoglu and Robinson are radical. They argue that there is no natural process whereby rising prosperity in an autocracy evolves into inclusion. Rather, it is only in the interest of the elite to cede power to inclusive institutions if confronted by something even worse, namely the prospect of revolution. The foundations of prosperity are political struggle against privilege.

Futurium and Policy making 3.0

The Futurium platform – The paper presents the Futurium platform, which was initially developed with the purpose of hosting and curating visions and policy ideas. The platform has evolved into an experiment for policymakers to engage more widely, referred to as ‘Policy Making 3.0’. The idea of Policy Making 3.0 is at a nascent and evolutionary stage.

The Futurium platform distinguishes between different variables, reflecting the emotional vs. rational mindsets of the participants, and offers the possibility to frame the engagement and co-creation process into multiple phases of a workflow.

Futurium is an early prototype implementation of the Policy Making 3.0 model, which is a long-term vision requiring further investigation and experimentation.

The keywords give an indication of the approach (especially evidence-based policy making): Policy Making 3.0, Foresight, Futures, Participatory policymaking, Crowdsourcing, Collective emotional intelligence, Collective rational intelligence, Evidence-based policy making, Data mining, Social media, Complex systems, Digital futures.

Rationale – Public policies need to be continuously reviewed and adapted to deal with unforeseen issues or to react to emergency situations, such as coping with the consequences of the ongoing systemic crisis. Rapidly evolving socio-political contexts exert influence on policymakers, who have to take decisions more quickly and accurately than in the past. Very often, they have no other choice but to react to emergencies. Hence, more than ever, there is a growing need to improve forward thinking in policymaking practices. New policies are often thought up on the basis of current trends rather than by capturing future opportunities offered, for instance, by long-term advances in science and technology.

Short-term vs. long-term – The need to focus on short-term measures often prevents governments and businesses from orientating their policy choices towards future possibilities, partly because they have been elected to come up with tangible responses to current challenges that matter to citizens, and partly because long-term investment decisions may be too risky. This may make it difficult to put in place sustainable solutions to structural problems.

Two tradeoffs and balances – How can these shortcomings in current policymaking be overcome? The challenges can be articulated along two main axes, highlighting typical tensions between different policymaking mindsets: 1. evidence about the status of the real world vs. inspiration from longer-term thinking, and 2. delegated leadership vs. participatory leadership.

Evidence vs. Futures – Our current ability to gain insight into the status of the real world (individuals, society, economy, environment, etc.) makes it possible to inform policy decisions more successfully than in the past.

Representative vs. Participatory Leadership – The advent of social networks has opened up new prospects for policymaking. They give a voice to everyone and allow people to organise themselves into groups and ultimately contribute to policy debates at local, national and international levels. Today, in principle, citizens could be empowered to co-decide on issues that matter to them by transposing well-established direct democracy tools such as referenda into the virtual space.

Trust and the scale of e-democracy tools – There is, however, still a long way to go, due, for instance, to the unresolved issues of trust in, and the security of, the underlying IT infrastructures, as well as identity management. Another challenge is that although e-democracy tools are now widely available, they have not been taken up by all citizens. Social media can, however, still be used to improve the links between policymakers and stakeholders and to take a more participatory approach to the design of future policies. Brainstorming and engagement techniques such as ‘the art of participatory leadership’, traditionally used in in-person workshops, could be transposed into the virtual space to engage (potentially) all citizens in policymaking.

Policy Making 3.0 – Policy Making 3.0 is a participatory and evidence-based model designed to provide an answer to the above challenges. It is based on the metaphor of a ‘collective brain’ (or emergent collective intelligence), according to which stakeholders and policymakers form a social network to co-design policies on the basis of two distinct factors: the rational, evidence-based contribution of the participants (the ‘left hemisphere’ of the social network’s brain) and their emotional and imaginative contribution (the ‘right hemisphere’). Policy Making 3.0 scales up the metaphor of the ‘left and right brains’ to the social network to make current policymaking processes more evidence-based, participatory, transparent and agile.

A common vocabulary – The paper then goes on to define a common vocabulary as a foundation for discussion. This includes the terms Vision, Trend, Future, Desirability, Likelihood, Policy, Impact, Plausibility and Support.

Proposed three layers in the co-creation process – The co-creation process consists of three layers: 1. Futures (what we all want to achieve), 2. Policies (how to underpin the futures), and 3. Agents (who executes the policies).

The Futurium architecture consists of front-end participatory tools, knowledge-harvesting tools for both policymakers and stakeholders, data-crawling tools, and data-gathering tools.
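As an illustration only, the paper’s vocabulary and three-layer co-creation model could be sketched as simple data structures. This is not taken from the Futurium implementation: every class and field name below (Vision, Policy, Agent, desirability, likelihood, etc.) is a hypothetical mapping of the paper’s terms, chosen to show how the ‘left brain’ (rational) and ‘right brain’ (emotional) factors might attach to the same objects.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch only: these names are not from the Futurium codebase.
# They illustrate how the paper's vocabulary (Vision/Future, Policy, Agent,
# Desirability, Likelihood, Impact, Plausibility, Support) might relate.

@dataclass
class Vision:
    """A 'Future': what the participants want to achieve (layer 1)."""
    title: str
    desirability: float = 0.0   # emotional/'right-brain' rating by participants
    likelihood: float = 0.0     # rational/'left-brain' estimate based on evidence

@dataclass
class Policy:
    """How to underpin one or more futures (layer 2)."""
    title: str
    impact: float = 0.0         # expected contribution to the linked visions
    plausibility: float = 0.0   # evidence-based feasibility
    support: float = 0.0        # participatory backing, e.g. share of votes
    futures: List[Vision] = field(default_factory=list)

@dataclass
class Agent:
    """Who executes the policies (layer 3)."""
    name: str
    policies: List[Policy] = field(default_factory=list)

# Usage sketch: one agent executing one policy linked to one vision.
v = Vision("Inclusive digital democracy", desirability=0.9, likelihood=0.5)
p = Policy("Fund e-participation pilots", impact=0.7, plausibility=0.6,
           support=0.8, futures=[v])
a = Agent("European Commission", policies=[p])
print(a.policies[0].futures[0].title)  # → Inclusive digital democracy
```

The point of the sketch is simply that the two ‘hemispheres’ are modelled as parallel variables on the same objects, so the platform can weigh emotional and rational signals side by side rather than collapsing them into one score.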

My analysis:

The Policy Making 3.0 model and its implementation platform are an ambitious attempt, and I watch them with interest. The hard part lies ahead: platforms can be built, but getting genuinely intelligent discussion and input (community engagement) is the difficult part. I shall try to contribute to this effort through my current and ongoing work.
