The four core mathematical disciplines that underpin machine learning are linear algebra, probability theory, calculus, and statistics.
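To make the list above concrete, here is a minimal sketch (not from the article; the data and variable names are invented) of fitting a one-variable linear model, showing where each discipline appears: statistics summarizes the data, linear algebra gives the closed-form least-squares solution, and calculus drives gradient descent to the same answer.

```python
# Toy illustration: fit y = w*x + b to noisy data two ways.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 4.0, 6.2, 7.9, 10.1]  # roughly y = 2x plus noise

# Statistics / probability: summarize the sample with means and
# (co)variances; the least-squares slope is cov(x, y) / var(x).
n = len(xs)
mx = sum(xs) / n
my = sum(ys) / n
cov_xy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
var_x = sum((x - mx) ** 2 for x in xs) / n

# Linear algebra: the closed-form least-squares solution.
w_closed = cov_xy / var_x
b_closed = my - w_closed * mx

# Calculus: reach the same answer by gradient descent, following the
# partial derivatives of the mean-squared-error loss.
w, b = 0.0, 0.0
lr = 0.02
for _ in range(5000):
    grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
    grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w_closed, 3), round(w, 3))  # both land near the true slope ~2
```

The two routes converge on the same fitted line, which is the point: the same model can be reached analytically (linear algebra) or iteratively (calculus), and the quality of either fit is judged statistically.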
Hevo Data (HD) is a cloud-based data integration platform that pulls together data from many sources.
For example, if order volume started to drop in one area, the team would look at web and ad traffic, customer engagement and feedback, as well as activity by competing services. That meant drawing on data from Google Analytics, Mixpanel, Zendesk, and Salesforce, which had to be organized and formatted first. This is a time-consuming process, and without a data integration platform, small to medium-sized companies may only be able to analyze data once or twice a week.
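The "organized and formatted first" step usually means mapping each source's export onto one shared schema. A minimal sketch of that idea (the field names and source exports below are invented for illustration and do not reflect Hevo's actual connectors or any real API):

```python
from datetime import datetime, timezone

# Records as two hypothetical sources might export them.
analytics_rows = [
    {"page": "/pricing", "visits": 120, "day": "2017-06-01"},
]
ticket_rows = [
    {"subject": "Order delayed", "opened_at": 1496275200, "priority": "high"},
]

def normalize_analytics(row):
    # Map the analytics export onto a shared event schema.
    return {
        "source": "web_analytics",
        "event": "page_view",
        "value": row["visits"],
        "timestamp": datetime.strptime(row["day"], "%Y-%m-%d")
                             .replace(tzinfo=timezone.utc),
    }

def normalize_ticket(row):
    # Map the support-ticket export onto the same schema.
    return {
        "source": "support",
        "event": "ticket_opened",
        "value": 1,
        "timestamp": datetime.fromtimestamp(row["opened_at"], tz=timezone.utc),
    }

# Merge both feeds into one chronologically ordered stream.
unified = ([normalize_analytics(r) for r in analytics_rows] +
           [normalize_ticket(r) for r in ticket_rows])
unified.sort(key=lambda r: r["timestamp"])
print([r["event"] for r in unified])
```

Done by hand for every source and every analysis, this mapping is exactly the repetitive work an integration platform automates.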
Hevo aims to reduce the time needed to prepare data from weeks to a few minutes.
Its platform gets the data ready for machine learning algorithms, which are only as good as their training data.
The platform is cloud-neutral, so users can run it on whichever cloud platform they use, including Amazon AWS, Google Cloud Platform, and Microsoft Azure. Hevo also offers an on-premise version for larger companies that run a private cloud.
RECORD companies and film studios have had to learn to live with internet piracy. Despite their best attempts to close sites or co-opt them, pirated copies of their wares are easily available. Increasingly, the same applies to scientific papers.
On June 21st a court in New York awarded Elsevier, a major scientific publisher, $15m in damages for copyright infringement by Sci-Hub and the Library of Genesis, two websites that offer tens of millions of scientific papers and books for anyone to download.
Both sites are increasingly popular with scientists, who use them to dodge pricey paywalls and subscriptions. Alexandra Elbakyan, who founded Sci-Hub in 2011, did not turn up for the trial (nor did the people behind LibGen). But she did send a letter outlining her reasons for starting the site. While at university in Kazakhstan she needed access to hundreds of papers for her studies. But the only way to get them, she said, was to pay $32 per paper, which she described as “just insane”. Having discovered other academics using the internet to trade copies of papers they could not pay for, she set up Sci-Hub to streamline the process.
An analysis of Sci-Hub’s server logs, published in Science in 2016, found its biggest users were people in Iran, India, and China. Such middle-income countries do not qualify for the subsidies big publishers provide to users in the poorest nations, but their universities nevertheless may not be able to afford subscriptions. Not every downloader was cash-strapped, though. Americans were the fifth-biggest users.
Ms. Elbakyan sees the website as a way to make the fruits of science available to researchers whose institutions cannot afford steep fees as well as to anyone else interested. She thinks of it as a radical version of “open access”, the idea that research—which is, after all, mostly funded through taxes—should be published in a way that makes it available to everyone. Unsurprisingly, publishers have little patience for such arguments. Elsevier argues that there is more to publishing than simply shoveling papers online and that work such as editing and arranging for reviews has to be paid for.
Both Sci-Hub and LibGen are based in Russia, beyond the reach of America’s courts. Nonetheless, the American Chemical Society, which publishes several journals, announced on June 28th that it had launched a lawsuit of its own. Provided Ms. Elbakyan does not travel to America, that lawsuit seems equally unlikely to succeed.
Ms. Elbakyan, though, may soon receive an invitation to visit America that does not come through legal channels: she has been tipped as a possible inaugural winner of the Disobedience Award, run by the Massachusetts Institute of Technology (MIT). The award was founded partly to commemorate Aaron Swartz, a former MIT student who also believed that academic papers should be freely available. After downloading millions of them from JSTOR, a paywalled repository, he was charged with hacking. He killed himself in 2013, shortly before his trial. If she does win, Ms. Elbakyan would presumably not attend the ceremony, although the magic of the internet might allow her to accept the gong remotely.
Source: The Economist, Vox
The world’s leading drug companies are turning to artificial intelligence to improve the hit-and-miss business of finding new medicines, with GlaxoSmithKline unveiling a new $43 million deal in the field on Sunday.
Other pharmaceutical giants including Merck & Co, Johnson & Johnson and Sanofi are also exploring the potential of artificial intelligence (AI) to help streamline the drug discovery process.
The aim is to harness modern supercomputers and machine learning systems to predict how molecules will behave and how likely they are to make a useful drug, thereby saving time and money on unnecessary tests.
AI systems already play a central role in other high-tech areas such as the development of driverless cars and facial recognition software.
“Many large pharma companies are starting to realize the potential of this approach and how it can help improve efficiencies,” said Andrew Hopkins, chief executive of privately owned Exscientia, which announced the new tie-up with GSK.
Hopkins, who used to work at Pfizer, said Exscientia’s AI system could deliver drug candidates in roughly one-quarter of the time and at one-quarter of the cost of traditional approaches.
The Scotland-based company, which also signed a deal with Sanofi in May, is one of a growing number of start-ups on both sides of the Atlantic that are applying AI to drug research. Others include U.S. firms Berg, Numerate, twoXAR and Atomwise, as well as Britain’s BenevolentAI.
“In pharma’s eyes these companies are essentially digital biotechs that they can strike partnerships with and which help feed the pipeline,” said Nooman Haque, head of life sciences at Silicon Valley Bank in London.
“If this technology really proves itself, you may start to see M&A with pharma, and closer integration of these AI engines into pharma R&D.”
STILL TO BE PROVEN
It is not the first time drugmakers have turned to high-tech solutions to boost R&D productivity.
The introduction of “high-throughput screening,” using robots to rapidly test millions of compounds, generated mountains of leads in the early 2000s but notably failed to solve inefficiencies in the research process.
When it comes to AI, big pharma is treading cautiously, in the knowledge that the technology has yet to demonstrate it can successfully bring a new molecule from computer screen to the lab to the clinic and finally to market.
“It’s still to be proven, but we definitely think we should do the experiment,” said John Baldoni, GSK’s head of platform technology and science.
Baldoni is also ramping up in-house AI investment at the drugmaker by hiring staff with relevant computing and data-handling experience, including astrophysicists.
His goal is to reduce the time it takes from identifying a target for disease intervention to finding a molecule that acts against it from an average of 5.5 years today to just one year in the future.
“That is a stretch. But as we’ve learned more about what modern supercomputers can do, we’ve gained more confidence,” Baldoni told Reuters. “We have an obligation to reduce the cost of drugs and reduce the time it takes to get medicines to patients.”
Earlier this year GSK also entered a collaboration with the U.S. Department of Energy and National Cancer Institute to accelerate pre-clinical drug development through the use of advanced computational technologies.
The new deal with Exscientia will allow GSK to search for drug candidates for up to 10 disease-related targets. GSK will provide research funding and make payments of 33 million pounds ($43 million) if pre-clinical milestones are met.