Student Paper Notes: The structure and function of complex networks by M.E.J. Newman

This time the student notes I post come from reading a scientific paper written by Mark Newman, Professor of Physics at the University of Michigan.

The link to the paper is this one. It is a 58-page paper with 419 references on complex networks; as an enticing introduction, nothing more should be required to suggest reading it. It complements the notes I previously published on the Network Science book authored by Albert-László Barabási. Compared to that book, this paper is slightly more condensed.

This paper contains relevant historical references on the field of Network Science and it also has a considerable mathematical component.

The following points are notes mostly literally extracted from this paper. The intent of this post is not to create new content but rather to provide Network Science students with a (literal and non-comprehensive) summary of this paper.

Why does this Information Security blog touch the field of Network Science? I am convinced that we can apply Network Science learning points to our real-life Information Security scenarios.

As always, a little disclaimer: These notes do not replace the reading of the paper, they are just that, some student notes (or fragments) of the paper.


- This paper reviews recent and non-recent work on the structure and function of networked systems.
- Vertices are nodes and edges are links.
- Degree: The number of edges connected to a vertex.
- Field of research: Consideration of large-scale statistical properties of graphs.
- The body of theory of Network Science has three aims: first, to find statistical properties, such as path lengths and degree distributions, that characterise the structure and behaviour of networked systems, and to suggest appropriate ways to measure these properties; second, to propose models of networks; and third, to predict the behaviour of networked systems.
- A hyperedge joins more than two vertices together.

Networks in the real world

- An acyclic graph has no closed loops. The WWW is in general cyclic.
- The small-world effect: Most pairs of vertices in most networks seem to be connected by a short path through the network.

Properties of networks

- Network transitivity (or, sometimes, clustering): "the friend of your friend is likely to be your friend".
- The clustering coefficient C is the mean probability that the friend of your friend is also your friend.
- The clustering coefficient C measures the density of triangles in the network.
- The probability that two vertices point to each other in a directed network is called reciprocity. In a directed network edges have a sense.
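
As a quick illustration of the clustering coefficient, here is a minimal sketch, assuming the Python networkx library is available; the graph size and edge probability are arbitrary illustration values, not taken from the paper.

```python
import networkx as nx

# Arbitrary illustration graph (size and edge probability are not from the paper).
G = nx.erdos_renyi_graph(n=200, p=0.05, seed=42)

# Global clustering coefficient: 3 x (number of triangles) / (number of connected triples).
C_global = nx.transitivity(G)

# Mean of the local clustering coefficients ("friend of a friend" probability per vertex).
C_local_mean = nx.average_clustering(G)

print(f"triangle-density clustering C = {C_global:.3f}")
print(f"mean local clustering         = {C_local_mean:.3f}")
```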

- In a random network, in which each edge is present or absent with equal probability, the degree distribution is binomial or Poisson in the limit of large graph size.
- Real-world networks are rarely random in their degree distributions.
- The degrees of the vertices in most real networks are highly right-skewed, i.e. their degree distributions have a long right tail of values far above the mean.
- Networks with power-law degree distributions are referred to as scale-free (in their degree distribution).
- The maximum degree of a vertex in a network will in general depend on the size of the network.
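
To see the contrast between a Poisson-like and a right-skewed degree distribution, here is a small sketch, again assuming networkx; the network sizes and parameters are arbitrary choices, and the Barabási-Albert generator merely stands in for "some scale-free network".

```python
import networkx as nx

# Two graphs with comparable mean degree (sizes and parameters are arbitrary).
graphs = {
    "random (Poisson-like)": nx.erdos_renyi_graph(n=10_000, p=4 / 10_000, seed=1),
    "scale-free (power-law tail)": nx.barabasi_albert_graph(n=10_000, m=2, seed=1),
}

for name, G in graphs.items():
    degrees = [d for _, d in G.degree()]
    mean_k = sum(degrees) / len(degrees)
    print(f"{name}: mean degree = {mean_k:.2f}, max degree = {max(degrees)}")
    # The long right tail of the scale-free graph shows up as a maximum degree
    # far above the mean, which the random graph does not produce.
```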

- Network resilience: Distance was almost entirely unaffected by random vertex removal. However, when removal targets highest degree vertices, it has devastating effects. An example of this is the Internet.
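
A minimal sketch of this resilience experiment, assuming networkx; the test graph, the removal fraction and the helper function name are my own illustration choices, not taken from the paper.

```python
import random
import networkx as nx

def giant_component_fraction(G):
    """Fraction of vertices in the largest connected component."""
    largest = max(nx.connected_components(G), key=len)
    return len(largest) / G.number_of_nodes()

G = nx.barabasi_albert_graph(n=5_000, m=2, seed=7)   # arbitrary scale-free test graph
k = int(0.05 * G.number_of_nodes())                  # remove 5% of vertices (arbitrary)

# Random removal: vertices chosen uniformly at random.
G_rand = G.copy()
random.seed(7)
G_rand.remove_nodes_from(random.sample(list(G_rand.nodes()), k))

# Targeted removal: the highest-degree vertices go first.
G_targ = G.copy()
hubs = sorted(G_targ.degree(), key=lambda nd: nd[1], reverse=True)[:k]
G_targ.remove_nodes_from([n for n, _ in hubs])

print("giant component after random removal:  ", giant_component_fraction(G_rand))
print("giant component after targeted removal:", giant_component_fraction(G_targ))
```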

- Assortative mixing or homophily: selective linking. A classical example of assortative mixing in social networks is mixing by race.
- In some networks, high-degree vertices tend to associate with other high-degree vertices. Most social networks appear to be assortative, other types of networks (biological, technological and information) appear to be disassortative.
- The traditional method for extracting community structure from a network is cluster analysis (also called hierarchical clustering).
- Community structure is a common network property.

- Navigation: Stanley Milgram's small-world experiment. Short paths exist in the network and network members are good at finding them. This is important e.g. for efficient database structures or better peer-to-peer computer networks.
- Betweenness centrality of a vertex in a network is the number of geodesic paths between other vertices that run through it. Betweenness centrality can also be viewed as a measure of network resilience.
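
A small sketch computing betweenness centrality, assuming networkx; Zachary's karate club graph is used only as a convenient standard example.

```python
import networkx as nx

G = nx.karate_club_graph()  # a standard small example network

# Betweenness centrality: (normalised) number of geodesic paths between other
# vertices that pass through each vertex.
bc = nx.betweenness_centrality(G)

# The highest-betweenness vertices are the ones whose removal hurts
# shortest-path communication the most.
top = sorted(bc.items(), key=lambda kv: kv[1], reverse=True)[:3]
print("highest-betweenness vertices:", top)
```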

Random graphs

- Poisson random graphs (or Bernoulli graphs) are not adequate to describe some important properties of real-world networks.
- The most important property of a random graph is that it possesses a phase transition: from a low-density phase with few edges, in which all components are small, with an exponential size distribution and finite mean size, to a high-density phase in which an extensive fraction of all vertices are joined together in a single giant component.
- The random graph reproduces the small-world effect well; however, in almost all other respects, the properties of a random graph do not match those of real-world networks.
- A random graph has a Poisson degree distribution, entirely random mixing patterns and no correlation between degrees of adjacent vertices, no community structure and, finally, navigation is not possible using local algorithms.
- The property of real graphs that is simplest to add to random graphs is a non-Poisson degree distribution, which gives the "configuration model". An example would be a network with a power-law degree distribution (see the sketch after this list).
- Other examples are: directed graphs and bipartite graphs (which have two types of vertices and edges running only between vertices of unlike types - these are sometimes studied using "one-mode" projections).
- An additional random graph model for degree correlation is the exponential random graph. A more specialised model is proposed by Maslov.
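
Here is a minimal sketch of the configuration model with an approximately power-law degree sequence, assuming networkx and numpy; the exponent, the network size and the decision to collapse multi-edges and self-loops are arbitrary simplifications of mine.

```python
import networkx as nx
import numpy as np

rng = np.random.default_rng(0)

# Draw an approximately power-law degree sequence (exponent and size are arbitrary).
n, gamma = 5_000, 2.5
degrees = np.round(rng.pareto(gamma - 1, size=n) + 1).astype(int)
if degrees.sum() % 2 == 1:           # the configuration model needs an even degree sum
    degrees[0] += 1

# Wire stubs together at random while preserving the prescribed degree sequence.
G = nx.configuration_model(degrees.tolist(), seed=0)
G = nx.Graph(G)                      # collapse multi-edges for simplicity
G.remove_edges_from(nx.selfloop_edges(G))

print("vertices:", G.number_of_nodes(), "edges:", G.number_of_edges(),
      "max degree:", max(d for _, d in G.degree()))
```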

Exponential random graphs and Markov graphs

- The only solvable random graph models that currently incorporate transitivity are the bipartite and the community-structured models plus certain dual graph models.
- Progress in understanding transitivity requires different (and new) models.

The small-world model

- A less sophisticated but more tractable model of a network with high transitivity. However, the degree distribution of the small-world model does not match most real-world networks very well.

- A lot of attention has been given to the average geodesic path length of the small-world model.

Models of network growth

- The best-studied class of network models aims to explain the origin of highly skewed degree distributions.
- Price probably described the first example of a scale-free network: the network of citations between scientific papers.
- Power laws arise when "the rich get richer", i.e. the amount you get goes up with the amount you already have. Price called this "cumulative advantage"; Barabasi and Albert called it "preferential attachment" (a minimal sketch of this growth mechanism follows this list).
- A note on notation: in this part of the paper, m denotes the mean degree rather than, as earlier, the total number of edges in the graph.
- The mechanism of cumulative advantage proposed by Price is widely accepted as the explanation for the power-law degree distribution in real-world networks such as the WWW, the citation network and possibly the Internet.
- The difference between the Price model and the Barabasi-Albert model is that in the latter the edges are undirected, so there is no distinction between in- and out-degree. The Barabasi-Albert model is simpler and slightly less realistic.
- There is a correlation between the age of vertices and their degrees, with older vertices having higher mean degree.
- Krapivsky and Redner show that there are correlations between the degrees of adjacent vertices in the model.
- The assumption of linear preferential attachment seems to be a reasonable approximation to the truth.
- The real WWW does not present the correlations between age and degree of vertices as found in the Barabasi and Albert model. This is, according to Adamic and Huberman, because the degree of vertices is also a function of their intrinsic worth.
- Bianconi and Barabasi have presented an extension of the Barabasi-Albert model: each newly appearing vertex is given a "fitness" that represents its attractiveness, i.e. its propensity to accrue new links.
- Price's model is intended to be a model of a citation network. Citation networks are directed and acyclic, and (approximately) all vertices belong to a single component, unless they neither cite nor are cited.
- Simple growth model by Callaway et al.: vertices normally have degree zero when they are first added to the graph. This model has no preferential attachment, so the resulting degree distributions are exponential rather than power law.
- Some networks appear to have power-law degree distributions but they do not show preferential attachment e.g. biochemical interaction networks. These networks could grow by copying vertices.
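
As announced above, here is a hand-rolled sketch of cumulative advantage / preferential attachment growth. It is an undirected toy version with one edge per new vertex, so it is a simplification of both the Price and the Barabasi-Albert models rather than a faithful implementation of either.

```python
import random

def preferential_attachment_graph(n, seed=0):
    """Grow an undirected graph one vertex at a time, attaching each new vertex to
    an existing vertex chosen with probability proportional to its degree
    (a minimal "rich get richer" sketch with one edge per new vertex)."""
    random.seed(seed)
    edges = [(0, 1)]              # start from a single edge
    # Each vertex appears in this pool once per unit of degree, so uniform
    # sampling from the pool implements linear preferential attachment.
    pool = [0, 1]
    for new_vertex in range(2, n):
        target = random.choice(pool)
        edges.append((new_vertex, target))
        pool.extend([new_vertex, target])
    return edges

edges = preferential_attachment_graph(10_000)
degree = {}
for u, v in edges:
    degree[u] = degree.get(u, 0) + 1
    degree[v] = degree.get(v, 0) + 1
print("max degree:", max(degree.values()),
      "mean degree:", sum(degree.values()) / len(degree))
```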

Processes taking place on networks

- Looking at the behaviour of models of physical, biological and social processes taking place on these networks.
- In Physics, vertices are sites and edges are bonds.
- A percolation process is one in which vertices or edges on a graph are randomly designated either "occupied" or "unoccupied", and one asks about various properties of the resulting patterns of vertices.
- The problem of resilience to random failure of vertices in a network is equivalent to a site percolation process on the network. The number of remaining vertices that can still successfully communicate is precisely the giant component of the corresponding percolation model.
- Networks with power-law degree distributions are highly susceptible to targeted attacks: one only needs to remove a small percentage of vertices to destroy the giant component entirely.
- Cascading failures: Watts provided a simple model for cascading failures as a type of percolation. It could be solved using generating function methods similar to those for simple vertex removal.

Epidemiological processes

- The SIR model divides the population into three classes: Susceptible, Infected and Recovered (with permanent illness immunity).
- Diseases do not always spread on scale-free networks.
- Vaccination can be modeled as a site percolation process.
- As networks tend to be particularly vulnerable to the removal of their highest degree vertices, targeted vaccination is expected to be particularly effective.
- It is not always easy to find the highest degree vertices in a social network.
- One is more likely to find high-degree vertices by following edges than by choosing vertices at random.
- Therefore, a population can be immunised by choosing a random person, vaccinating a friend of that person, and then repeating the process (see the sketch after this list).
- The SIS model: an example is computer viruses.
- At least in networks with right-skewed degree distributions, propagation of the disease turns out to be relatively robust against random vaccinations but highly susceptible to vaccination of the highest-degree individuals.
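
A minimal sketch of the "vaccinate a random acquaintance" idea mentioned above, assuming networkx; the contact network, the number of vaccinations and the function name are arbitrary illustration choices.

```python
import random
import networkx as nx

def acquaintance_immunization(G, n_vaccinations, seed=0):
    """Pick a random person, vaccinate a randomly chosen friend of that person,
    and repeat. Following an edge biases the choice towards high-degree vertices
    without requiring knowledge of the whole degree sequence."""
    random.seed(seed)
    vaccinated = set()
    nodes = list(G.nodes())
    while len(vaccinated) < n_vaccinations:
        person = random.choice(nodes)
        friends = list(G.neighbors(person))
        if friends:
            vaccinated.add(random.choice(friends))
    return vaccinated

G = nx.barabasi_albert_graph(10_000, 3, seed=1)      # arbitrary contact network
vacc = acquaintance_immunization(G, n_vaccinations=500)
mean_deg_all = sum(d for _, d in G.degree()) / G.number_of_nodes()
mean_deg_vacc = sum(G.degree(v) for v in vacc) / len(vacc)
print(f"mean degree overall: {mean_deg_all:.1f}, "
      f"mean degree of vaccinated: {mean_deg_vacc:.1f}")
```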

Exhaustive search 

- A page is important if it is linked to by many other pages.
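
PageRank is one well-known formalisation of this idea (the paper discusses web search and importance measures more generally, so this is only a related illustration). A minimal sketch, assuming networkx and an arbitrary toy web graph:

```python
import networkx as nx

# A tiny directed web graph; the links are arbitrary illustration data.
web = nx.DiGraph([("A", "B"), ("C", "B"), ("D", "B"), ("B", "A"), ("D", "A")])

# PageRank-style importance: a page scores highly when many (important) pages link to it.
scores = nx.pagerank(web)
print(sorted(scores.items(), key=lambda kv: kv[1], reverse=True))
```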

Guided search

- Performs small special-purpose crawls.
- It relies on the assumption that pages containing information on a particular topic tend to be clustered together in local regions of the graph.

Network navigation

- The objective is to design network structures that make a particular search algorithm perform well.
- The "social distance" is measured by the height of the lowest level in the tree at which they are both connected. In other words, how far one must go up the tree to find the lowest “common ancestor” of the pair.

Phase transition on networks

- E.g. models of opinion formation in social networks.

Other processes on networks

- Voter models, diffusion, genetic regulatory models, etc.

- The study of complex networks is still in its infancy. 

Happy networking!

The network castle


Student Book Notes: Network Science by Albert-L. Barabasi - A powerful new field

Attending the 2015 summer conferences organised by @cigtr, I came across a book authored by Albert-László Barabási titled "Network Science". Actually, it was Mathematics Professor Regino Criado who pointed me to Barabási's name.

The book opens minds and new knowledge fields using Mathematics. It is worth reading and studying! Actually, all chapters and other resources can be found on the book's site. Thanks to the author for making it freely available under the Creative Commons licence.

I read all ten chapters and highlighted some sentences from each of them. I enumerate some of those highlighted points as if this post were a brief collection of notes on the book, hoping that more than one of my blog's readers will decide to embark on reading the book after going through this introductory post. Network Science students could use this post as a quick (and very initial) cheat sheet.

Happy networking! 

Chapter X - Preface

Understanding networks is required to understand today's world. This section describes how the textbook is used in a Network Science class.

Chapter 0 - Personal Introduction

This chapter describes how the author got trapped by the beauty and the importance of networks. He already mentions contributions such as Béla Bollobás's work on random graphs and the work of Erdos and Renyi. It also talks about the difference between social scientists and graph theorists.

Key introductory statements:

- "A simple model, realying on growth and preferential attachment could explain the power laws spotted on some networks".

- "Scale-free networks proved to be surprisingly resistant to failures but shockingly sensitive to attacks".

Chapter 1 - Intro

- "The interplay between network structure and dynamics affects the robustness of the system".
- In a complex system it is difficult to derive the collective behaviour from the knowledge of the system's components.
- "Most networks are organised by the same principles".
- "The most succesful companies of the 21st Century base their technology and business model on networks".
- Epidemic transmission is one real example of the applicability of this new maths-based science.

Chapter 2 - Graph Theory

- "Graph theory is the mathematical scaffold behind network science".
- An Eulerian path traverses each link exactly once. Such "a path cannot exist on a graph that has more than two nodes with an odd number of links".
- Network parameters: Number of nodes, number of links, directness or undirectness of links.
- The choice of nodes and links is important when describing a network.
- Node degree is the number of links to other nodes.
- Average degree in a network: An important variable to play with.
- In directed networks, we talk about incoming degree and outgoing degree.
- Total number of links is denoted by L.
- Average degree: k = 2L/N, where N is the total number of nodes (see the sketch at the end of these chapter notes).
- "Degree distribution provides the probability that a randomly selected node in the network has degree k".
- "The number of degree-k nodes can de obtained from the degree distribution as N(k)=Np(k)".
- "The adjancency matrix of an undirected network is symmetric".
- "For weighted networks the elements of the adjancency matrix carry the weight of the link".
- Metcalfe's law states that the value of a network is proportional to the square of the number of its nodes.
- Bipartite networks can be divided into two disjoint sets of nodes such that each link connects a node from one set to a node from the other set.
- A path's length is the number of links it contains.
- In networks physical distance is replaced by path length.
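
As promised above, here is a small sketch illustrating k = 2L/N and N(k) = N·p(k) on a toy undirected network, assuming numpy; the adjacency matrix is an arbitrary example.

```python
import numpy as np

# Adjacency matrix of a small undirected (hence symmetric) toy network.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]])

N = A.shape[0]
degrees = A.sum(axis=1)      # node degrees k_i
L = degrees.sum() / 2        # each undirected link is counted twice
avg_k = 2 * L / N            # k = 2L / N

# Degree distribution p(k) and the number of degree-k nodes N(k) = N * p(k).
values, counts = np.unique(degrees, return_counts=True)
p_k = counts / N
print("L =", L, " average degree =", avg_k)
print("k:", values, " p(k):", p_k, " N(k):", counts)
```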

Note: I will not use quotation marks in this post anymore. All points are extracted from the mentioned book; please consider all of them to be literal or almost literal words from the reviewed book. I also informed Albert-László Barabási about the publication of this post.

- Distance between nodes changes if the network is directed, i.e. d(A,B) may not be equal to d(B,A).
- Connected and disconnected networks (disconnected if there is at least a pair of nodes with infinite distance).
- A bridge is any link that, if cut, disconnects the network.
- The clustering coefficient measures the network's local link density.
- The maximal distance in a network is the diameter. The breadth-first-search algorithm helps find it.

Chapter 3 - Random networks

- A random network is a collection of N nodes where each node pair is connected with probability p.
- A cocktail party chitchat scenario is an example of a random network.
- The degree distribution of a random network has the form of a Poisson distribution.
- The random network model does not capture the degree distribution of real networks. Nodes in random networks have comparable degrees, forbidding hubs (highly connected nodes).
- We have a giant component if and only if each node has on average more than one link.
- Evolution of a random network as a function of the average degree k: subcritical, critical, supercritical and connected regimes (see the sketch at the end of these chapter notes).
- The small world phenomenon implies that the distance between two randomly chosen nodes in a network is short.
- Most real networks are in the supercritical regime.
- Real networks have a much higher clustering coefficient than expected for a random network of similar N and L.
- Real networks are not random.
- The random network model is important in network science. Features of real networks not present in random networks may represent some signature of order.
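
The sketch referenced above: generating Erdos-Renyi random networks at different average degrees and watching the giant component emerge around an average degree of 1, assuming networkx; the network size and degree values are arbitrary illustration choices.

```python
import networkx as nx

N = 10_000                                  # arbitrary network size
for avg_k in [0.5, 1.0, 1.5, 3.0]:          # subcritical, critical, supercritical
    G = nx.erdos_renyi_graph(N, avg_k / (N - 1), seed=3)
    giant = max(nx.connected_components(G), key=len)
    print(f"average degree {avg_k}: giant component holds {len(giant) / N:.1%} of the nodes")
```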

Chapter 4 - The scale-free property

- Let's remember that in a random network there are no highly connected nodes (hubs).
- The existence of hubs (e.g. in the WWW) is a signature of a network organising principle called the scale-free property.
- The degree distribution of a scale-free network follows a power law, not a Poisson distribution as in random networks.
- A scale-free network has a large number of small-degree nodes, more than in a random network.
- In a Poisson distribution (random network), most of the nodes have the same amount of links (the size of the largest node grows logarithmically or slower with N, the number of nodes).
- In a power-law distribution (scale-free network) many nodes have only a few links and there are a few hubs with a large number of links (widely different degrees, spanning several orders of magnitude).
- The larger the network, the larger the degree of its biggest hub (it grows polynomially with the network size).
- Random networks have a scale: Nodes have comparable degrees and the average degree serves as the scale of a random network.
- The scale-free property is missing in those networks that limit the number of links that a node can have.
- Ultra-small world property: distances in a scale-free network are smaller than in an equivalent random network.
- The bigger the hubs, the more effectively they shrink distances between nodes.
- Scale-free networks are ultra-small when the value of the degree exponent is between 2 and 3.
- The configuration model, the degree-preserving randomization and the hidden parameter model can generate networks with a pre-defined degree distribution.
- Erdos-Renyi and Watts-Strogatz described exponentially bounded networks. They lack outliers; most nodes have comparable degrees (e.g. the power grid and highway networks). For these networks, the random network model is a suitable starting point.
- For networks with fat-tailed degree distributions, the scale-free model offers a better approximation.

Chapter 5 - The Barabasi-Albert model

- In scale-free networks, nodes prefer to link with the most connected nodes (preferential attachment).
- Growth and preferential attachment are responsible, and simultaneously needed, for the emergence of scale-free networks.
- Older nodes have an advantage in becoming hubs over time.
- The Barabasi-Albert model generates a scale-free network with degree exponent γ = 3 (see the sketch after this list).
- To date all known models and real systems that are scale-free have preferential attachment.
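
The sketch referenced above, assuming networkx and numpy; the log-log straight-line fit is only a rough way to eyeball the exponent, not a rigorous estimator, and the tail cut-off is arbitrary.

```python
import collections
import numpy as np
import networkx as nx

G = nx.barabasi_albert_graph(n=50_000, m=3, seed=5)   # growth + preferential attachment

# Rough check of the power-law tail: straight-line fit to log p(k) vs log k.
counts = collections.Counter(d for _, d in G.degree())
k = np.array(sorted(counts))
p_k = np.array([counts[x] for x in k]) / G.number_of_nodes()
tail = k >= 10                                         # crude, arbitrary cut-off
slope, _ = np.polyfit(np.log(k[tail]), np.log(p_k[tail]), 1)
print("estimated degree exponent:", -slope)            # should come out near 3
```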

Chapter 6 - Evolving networks

- The Bianconi-Barabasi model can account for the fact that nodes with different internal characteristics acquire links at different rates.
- The growth rate of a node is determined by its fitness. This model allows us to calculate the dependence of the degree distribution on the fitness distribution.
- Fitness distribution is typically exponentially bounded, meaning that fitness differences between nodes are small. With time, these differences are magnified, resulting in a power-law degree distribution.
- Bose-Einstein condensation: the fittest node grabs a finite fraction of the links, turning into a super hub and creating a hub-and-spoke topology (the rich-get-richer process or winner-takes-all phenomenon), and the network loses its scale-free nature.
- In most networks, nodes can disappear.
- As long as the network continues to grow, its scale-free nature can persist.

Chapter 7 - Degree correlation

- A way to go deeper into understanding network structures based on maths.
- In some networks, hubs tend to have ties to other hubs. That is an assortative network. In disassortative networks, hubs avoid each other.
- A network displays degree correlations if the number of links between the high- and low-degree nodes is systematically different from what is expected by chance (see the sketch after this list).
- There is a conflict between degree correlation and the scale-free property: hubs would have to be linked to each other by more than one link.
- Assortative mating reflects the tendency of individuals to date or marry individuals that are similar to them.
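
The sketch referenced above: comparing the degree assortativity coefficient of two generated networks, assuming networkx; the particular generators and sizes are arbitrary illustration choices.

```python
import networkx as nx

# Positive assortativity: hubs link to hubs; negative: hubs avoid each other.
graphs = {
    "scale-free (BA)": nx.barabasi_albert_graph(5_000, 3, seed=6),
    "random (ER)": nx.erdos_renyi_graph(5_000, 0.002, seed=6),
}
for name, G in graphs.items():
    r = nx.degree_assortativity_coefficient(G)
    print(f"{name}: degree assortativity r = {r:+.3f}")
```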

Chapter 8 - Network robustness

Once the fraction of removed nodes reaches a critical threshold in a random network, the network abruptly breaks into disconnected components. Percolation theory can be used to describe this transition in random (Erdos-Renyi) networks, i.e. networks in which nodes have comparable degrees.

Real networks show robustness against random failures. Scale-free networks show a greater degree of robustness against random failures. However, an attack that targets a hub can easily destroy a scale-free network. Depending on the network (the WWW, or a disease propagation), this can be bad or good news.

The failure propagation model and the branching model (plus the overload model and the sandpile model in the critical regime) capture the behaviour of cascading failures. All these models predict the existence of a critical state in which the avalanche sizes follow a power law.

A network that is robust to both random failures and attacks has a hub and many nodes with the same degree, i.e. a hub-and-spoke topology.

Chapter 9 - Communities

A community is a locally dense connected subgraph in a network. There are weak and strong communities, depending on the internal and external number of links of the nodes.

The number of potential partitions of a network grows faster than exponentially with the network size.

The higher a node's degree, the smaller its clustering coefficient.

Randomly wired networks lack an inherent community structure.

Modularity measures the quality of each partition. Modularity optimization offers a novel approach to community detection.

For a given network the partition with maximum modularity corresponds to the optimal community structure.
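
As an illustration of modularity optimisation, here is a minimal sketch using greedy modularity maximisation, assuming networkx; this is just one of the many community detection heuristics the chapter alludes to.

```python
import networkx as nx
from networkx.algorithms import community

G = nx.karate_club_graph()                   # classic small test network

# Greedy modularity maximisation: one heuristic for finding high-modularity partitions.
partition = community.greedy_modularity_communities(G)
Q = community.modularity(G, partition)

print("number of communities found:", len(partition))
print("modularity of the partition:", round(Q, 3))
```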

A node is rarely confined to a single community. However links tend to be community specific.

The development of the fastest and the most accurate community detection tool remains an active arms race.

The community size distribution is typically fat-tailed, indicating the existence of many small communities with a few large ones.

Community finding algorithms run behind many social networks to help discover potential friends, posts of interest and targeted advertising.

Chapter 10 - Spreading phenomena

 A super-spreader is an individual responsible for a disproportionate number of infections during an epidemic.

Network epidemics offer a model to explain the spread of infectious diseases.

The homogeneous mixing hypothesis (also named fully mixed or mass action approximation) assumes that each individual has the same chance of coming into contact with an infected individual.

Different models capture the dynamics of an epidemic outbreak: Susceptible-Infected, Susceptible-Infected-Susceptible and Susceptible-Infected-Recovered.
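
A very small discrete-time SIR sketch on a contact network, assuming networkx; the transmission and recovery probabilities, the network and the function name are arbitrary illustration choices, not parameters from the book.

```python
import random
import networkx as nx

def run_sir(G, beta=0.05, gamma=0.2, seed=0, steps=100):
    """Discrete-time SIR on a network: at each step an infected node transmits to
    each susceptible neighbour with probability beta, then recovers with
    probability gamma. Newly infected nodes start transmitting on the next step."""
    random.seed(seed)
    status = {n: "S" for n in G}
    status[random.choice(list(G))] = "I"     # patient zero
    for _ in range(steps):
        infected = [n for n, s in status.items() if s == "I"]
        if not infected:
            break
        for n in infected:
            for nb in G.neighbors(n):
                if status[nb] == "S" and random.random() < beta:
                    status[nb] = "I"
            if random.random() < gamma:
                status[n] = "R"
    return sum(1 for s in status.values() if s != "S") / G.number_of_nodes()

G = nx.barabasi_albert_graph(10_000, 3, seed=2)       # arbitrary contact network
print("fraction ever infected:", run_sir(G))
```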

In a large scale-free network a virus can reach instantaneously most nodes and even viruses with small spreading rate can persist in the population.

The human sexual encounter network is a scale-free network.

Bursty interactions are observed in a number of contact processes of relevance for epidemic phenomena.

In assortative networks, high-degree nodes tend to link with high-degree nodes. Equally, strong ties tend to be within communities while weak ties are between them.

Several network characteristics can affect the spread of a pathogen in a network (e.g. degree correlations, link weights or a bursty contact pattern).

In a scale-free network, random immunization does not eradicate a disease. Selective immunization targeting hubs helps eradicate the disease.

The friendship paradox: on average, the neighbours of a node have a higher degree than the node itself. So, let's immunize the neighbours of randomly selected nodes.

Travel restrictions do not decrease the number of infected individuals; they only delay the outbreak, perhaps giving time to expand local vaccinations.

We can use the effective distance (different from the physical distance) to determine the speed of a pathogen.

All in all, a recommendable reference for those willing to get introduced into the Network Science field.

I will be happy to extend this post with comments coming from readers of the "Network Science" book.

Let's network!

Security site to bookmark:

An elegant way to sell security

Every now and then we need a chance to slow down our tactical professional everyday pace and think strategically. For those moments, I propose visiting Lares, a boutique-like security company founded by Chris Nickerson and staffed also by Eric M. Smith. Both are reputable security professionals who have greatly contributed to the security community.

Chris Nickerson conducted the famous and irreverent "Exotic Liability" security podcast. Unfortunately, the last available episode already dates from 2013. Chris is also a regular presenter at many international security conferences and one of the authors of the Penetration Testing Execution Standard. The number of followers he has on his Twitter account confirms his relevance in the community.

Eric M. Smith has also presented at events such as DefCon 22 in 2014: Along with Josh Perrymon, they studied the topic of RFID chip security.

Let's go now through some of the sections of the site:

- Lares in action can inspire us to come up with alternative ideas to the traditional way of creating and selling security services. It contains more than a dozen videos, and their presentations, of appearances at conferences like BSides, Troopers or Source Barcelona. The historical and practical approaches they propose on how to implement security are worth thinking about.

As an example, we find almost 80 pages on how to increase the value of traditional security testing (both for vulnerability management and penetration testing). That slide deck is not only fun but also innovative. They use useful concepts such as insider threat assessment, adversary modeling and the continuous implementation of security tests along any technological process.

- It is great to see in their services section that, together with traditional vulnerability assessments and security testing, they also offer business impact analysis as a value added security deliverable.

- There is a social engineering section, labeled "Layer 8 labs". This is an appropriate name, considering the human element as another layer on top of the 7 layers of the OSI communication model. "Layer 8 labs" provides controlled "phishing" campaigns to increase security awareness among employees in companies and organisations.

As a final comment, I would highlight the modern design of this website: it helps underline the valuable security content they provide.

Happy ninja reading!

Adversary modeling

Security site to bookmark:

Sharing information about real threats and real attacks

We human beings live in communities. Threats that may affect our group are an important piece of information to communicate to our peers. This information brings greater preparedness against potential risks; in addition, if those risks really do materialise, a faster and more effective reaction is possible.

Something similar can be seen on the Internet: OpenIOC is an example of this. It proposes an automated way to share information about real threats on the Internet.

According to its homepage, OpenIOC "facilitates the exchange of indicators of compromise ("IOCs") in a computable format" i.e. ready to be processed by information systems such as intrusion detection systems and application layer filtering firewalls.

Each compromise indicator contains three elements:
- First, the metadata, which provide contextual information such as the author of the indicator, the name of the indicator and a brief description.
- Second, references, so you can link the indicator to a particular wave of attacks.
- Third, its definition, which describes its specific infection mechanisms and operation.

A valuable detail of this format is the possibility of using Boolean logic to filter indicators automatically.

OpenIOC is an extensible XML encoding protocol initially designed for Mandiant security products such as "IOC Editor", a free XML editor for indicators of compromise, and "Redline", a compromise verification tool for Windows installations, also free.

Security incident responders were interested in this initiative, and Mandiant finally standardised OpenIOC and made it available to the open source community in 2011.

OpenIOC is currently an open initiative. For example, in OpenIOC Google Groups there is a very active forum where you can get information on how to use this format with log analysis tools like "Splunk" or references of indicator repositories such as

Based on the increasing number of security incidents on the Internet, related information sharing will grow over the coming years, especially among companies with a similar risk profile.

Perhaps a pending task of this project is to implement a non-intrusive compromise detection service for end users outside major corporations.

Happy protection!

You can also read this post in Spanish here.

Fly high!

Book Review: Steve Jobs Hardcover by Walter Isaacson - Lessons for Information Security?

I went through Mr. Isaacson's biography of Steve Jobs and I would like to share with my information security community some very personal and biased learning points, potentially applicable to our industry.

As always, the intent of this post is not to replace the reading of this highly interesting and very well written book by Walter Isaacson.

- The author talks about reality distortion fields, how some people live inside them, and how difficult it is for the rest of us mortals to interact with them when we realise that their reality is different from ours.

In information security there are many people living within reality distortion fields.

- However, there is a positive side to reality distortion fields: sometimes they become reality if effort, passion and innovation (and luck? and timing?) kick in.

Totally applicable to information security.

- Even introverts need a dense and effective network to succeed in business.

Frequently forgotten in Infosec.

- Successful business people are not necessarily successful parents.
- Successful business people are not necessarily ethical colleagues.

- Selling abilities are key in every social aspect of our lives (business, social, family).
- Some things definitely cannot be patented.
- Money is a hygiene factor in motivation.

- You can shift working passions over a long period of time (11 years passed between his being ousted from Apple and his return).

Also very applicable (but hardly applied) to infosec people.

- Charismatic people tend to have more troublesome lives than peacefully smooth characters.

Just go and attend any security conference, mingle with people around, and you will confirm this statement.

- The way a company is run can benefit hugely from innovation. We can innovate in the way we manage a company, or a team.

Totally applicable to information security.

- Brutal honesty is a management approach; it is up to the actors (the sender and the recipient) to accept it or not.

- Marketing is key - so key that every Wednesday afternoon, every week, the CEO would meet with the marketing people.

Marketing, the forgotten element in most Information Security units.
- Do you control, end to end, the experience your customer or user goes through when using your product or service?

Innovative element that security practitioners can apply from day one when they design their deliverables.

- Internal product cannibalisation? Go for it - Otherwise other companies and products will cannibalise yours.

Applicable to our information security products? Certainly. Let's do it.

- Persistence: key for success. Sometimes we need to devote years for something to succeed.

Is our industry persistent enough? Nice topic for a discussion.

- The second-product effect: if your company does not know why its first product was a success, then it will fail with its second product.

Have it in mind when expanding your security portfolio.

- Electronic communication is fine, but if you want to trigger and foster innovation, make physical, face-to-face communication happen.

A piece of wisdom here for our industry, in which we overuse non-physical communication channels.

- Do not mention the ideas you have for a new product before you launch it... or someone else will be faster than you.

Already applied in our industry ;-)

- Privacy and running a publicly traded company create sometimes some conflicts.

Difficult to accept sometimes, but is privacy already gone (or soon to be)?

- Sometimes a product launch is a failure and later on it gets transformed into a historic breakthrough, especially if you use powerful marketing to let people know how they can use it.

Again, a link to smart marketing, which in our industry still does not exist.

- Sometimes, going through serious health problems does not make rough characters softer.

- A feature of Apple's culture: "accountability is strictly ensured".

Tough but effective.

- One of the next revolutions to come: textbooks and education. They have not really changed much in many years.

Are we still on time in terms of securing the coming new learning experience?

- The clash of two different technology philosophies, open versus closed, in terms of where software runs, and the different approaches Microsoft and Apple followed.

- Things can really change (although most of the time you need time, passion and patience). E.g. in 2000 Apple was worth only a twentieth of Microsoft's market value; in 2010 Apple surpassed Microsoft in market value.

- In business, you choose: either devote your time to start dying or devote it to being reborn.

- And, last but not least, when death comes to visit us, we all strive to find that peace of mind that was difficult to find during our lives with our people (family, relatives, colleagues, etc.).

Happy innovation!

Being fast!