Blog – Page 5 of 7

Blog

Is Artificial Intelligence the new disruptive wave?

The overall Artificial Intelligence (AI) market is expected to be worth USD 16.06 Billion by 2022, growing at a CAGR of 62.9% from 2016 to 2022 (source: marketsandmarkets.com). The major factor driving the artificial intelligence market globally is the growing number of applications of AI technologies in various end-user verticals and the growing adoption…
Read more

Our life in 2030: outlooks from one of the most compelling studies on AI

How will our life be in 2030? All the answers from the seventeen-member panel of ‘The One Hundred Year Study on Artificial Intelligence’, comprising experts in AI from academia, corporate laboratories and industry, and AI-savvy scholars in law, political science, policy, and economics, who have considered recent advances in AI and potential societal…
Read more

Keep calm… My first (unforgettable) Hackathon

I was browsing social media a few weeks ago (yes, maybe I was slacking!) when I saw a sponsored post announcing a hackathon in Milan. It’s an event where, for a long weekend, a bunch of technology nerds who think they can change the world with their visions are shut in a shed for 36-48…
Read more

Experimental research: my journey in Wonderland

The scanning tunnelling microscope, which earned its two inventors a Nobel Prize and later led to the atomic force microscope, with which one can now study molecular structures and their ‘behaviour’, up to nanotechnologies for advanced diagnostics. Here’s what happens in the state-of-the-art experimental research centre where IT’s technological evolution is just a…
Read more

Just a decade and we’ll have quantum computing

Computation based on qubits and neuromorphic chips: this is the new computing expertise at work in the IBM research center in Zurich – systems that might lead to new and even more powerful cognitive systems. In a decade we might already see the first ‘standard’ quantum computer. A center of pure experimental…
Read more

The 6 rules of smart simplicity

Being fully independent in a cooperative environment to help business productivity. No, this is not an advert or even a political slogan; according to BCG, it is the essence of digital transformation. The organisational structures of modern companies are mainly inspired by ‘Taylorism’ (named after the American engineer Frederick Taylor, who began to research methods…
Read more

The productivity paradox

Technology, created to ‘free’ employees, has often proven to be a trap, yet the ongoing digital transformation could actually break the productivity paradox, allowing IT to return to its role as a supporter of growth. The famous American economist Robert Solow, who was awarded the Nobel Prize in Economics in 1987 for his contributions to…
Read more

Digital Supply Chain: where to begin

Making companies ‘smarter’ by digitally managing the supply chain is now a reality. The most basic principle is always the same: work on data! In the last few decades, companies began to consider technology, lean manufacturing and global production as ways to increase efficiency and reduce costs. Today, these strategies are leading to decreasing returns…
Read more

Smart Manufacturing: towards a Digital Supply Chain

Superior performance, increased productivity, profitability and competitiveness, and the ability to better satisfy customers and ‘predict’ the market. These are the results obtained by companies that invest in innovative technologies. The Boston Consulting Group analyses the leaders of the digital supply chain. For years, companies have exploited digital technologies in the supply chain to improve…
Read more

Our future with nanotechnology

Today’s computational power makes it possible to emulate the human brain but, for now, it is not economically and energetically sustainable: a human brain consumes far less energy to solve problems than a machine. However, increasingly intensive miniaturisation enables unparalleled processing performance and intelligent systems. GPGPU computing and quantum computing are two areas where we expect…
Read more