The crypto rich find security in Anchorage

Not the city, the $57 million-funded cryptocurrency custodian startup. When someone wants to keep tens or hundreds of millions of dollars in Bitcoin, Ethereum, or other coins safe, they put them in Anchorage’s vault. And now they can trade straight from custody so they never have to worry about getting robbed mid-transaction.

With backing from Visa, Andreessen Horowitz, and Blockchain Capital, Anchorage has emerged as the darling of the cryptocurrency security startup scene. Today it’s flexing its muscle and war chest by announcing its first acquisition, crypto risk modeling company Merkle Data.

Anchorage Security

Anchorage founders

Anchorage has already integrated Merkle’s technology and team to power today’s launch of its new trading feature. It eliminates the need for big crypto owners to manually move assets in and out of custody to buy or sell, or to set up their own in-house trading. Instead of pocketing an undisclosed spread between the spot price and the price it quotes clients, Anchorage charges a transparent per-transaction fee of a tenth of a percent.
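
For a sense of how a flat per-transaction fee compares with an undisclosed spread, here is a back-of-the-envelope sketch; only the 0.1% rate comes from Anchorage, and the trade size and prices below are invented for illustration.

```python
# Back-of-the-envelope comparison of a transparent 0.1% fee vs. a hidden spread.
# Only the 0.1% rate is from the article; the trade size and prices are invented.

def transparent_fee(trade_usd: float, fee_rate: float = 0.001) -> float:
    """Cost of a flat percentage fee on the trade size."""
    return trade_usd * fee_rate

def spread_cost(trade_usd: float, spot_price: float, quoted_price: float) -> float:
    """Implicit cost when a desk quotes a price away from the spot price."""
    return trade_usd * abs(quoted_price - spot_price) / spot_price

trade = 10_000_000  # hypothetical $10M order
print(transparent_fee(trade))                                   # 10000.0
print(spread_cost(trade, spot_price=9500, quoted_price=9550))   # ~52631.58
```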

It’s stressful enough trading around digital fortunes. Anchorage gives institutions and token moguls peace of mind throughout the process, and lets them stake and vote while their riches are in custody. Anchorage CEO Nathan McCauley tells me, “Our clients want to be able to fund a bank account with USD and have it seamlessly converted into crypto, securely held in their custody accounts. Shockingly, that’s not yet the norm, but we’re changing that.”

Buy and sell safely

Founded in 2017 by leaders behind Docker and Square, Anchorage’s core business is its omnimetric security system that takes passwords that can be lost or stolen out of the equation. Instead, it uses humans and AI to review scans of your biometrics, nearby networks, and other data for identity confirmation. Then it requires consensus approval for transactions from a set of trusted managers you’ve whitelisted.
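
The article doesn’t spell out how that consensus step works internally, but the core idea, several whitelisted managers must sign off before a transaction moves, can be sketched roughly as below; the manager names and the three-of-four threshold are assumptions, not Anchorage’s actual policy.

```python
# Minimal sketch of quorum-based transaction approval, assuming a simple
# "M of N whitelisted approvers" policy; not Anchorage's actual implementation.
from dataclasses import dataclass, field

@dataclass
class WithdrawalRequest:
    asset: str
    amount: float
    destination: str
    approvals: set = field(default_factory=set)

WHITELIST = {"alice", "bob", "carol", "dave"}  # hypothetical trusted managers
QUORUM = 3                                     # hypothetical M-of-N threshold

def approve(req: WithdrawalRequest, manager: str) -> None:
    if manager not in WHITELIST:
        raise PermissionError(f"{manager} is not a whitelisted approver")
    req.approvals.add(manager)

def is_executable(req: WithdrawalRequest) -> bool:
    return len(req.approvals & WHITELIST) >= QUORUM

req = WithdrawalRequest("BTC", 250.0, "bc1q...")
for manager in ("alice", "bob", "carol"):
    approve(req, manager)
print(is_executable(req))  # True once 3 of 4 whitelisted managers have signed off
```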

With Anchorage Trading, the startup promises efficient order routing, transparent pricing, and multi-venue liquidity from OTC desks, exchanges, and market makers. “Because trading and custody are directly integrated, we’re able to buy and sell crypto from custody, without having to make risky external transfers or deal with multiple accounts from different providers,” says Bart Stephens, founder and managing partner of Blockchain Capital.

Trading isn’t Anchorage’s primary business, so it doesn’t have to squeeze clients on their transactions and can instead try to keep them happy for the long term. That also sets up Anchorage to be a foundational part of the cryptocurrency stack. It wouldn’t disclose the terms of the Merkle Data acquisition, but the Pantera Capital-backed company brings quantitative analysts to Anchorage to keep its trading safe and smart.

“Unlike most traditional financial assets, crypto assets are bearer assets: in order to do anything with them, you need to hold the underlying private keys. This means crypto custodians like Anchorage must play a much larger role than custodians do in traditional finance,” says McCauley. “Services like trading, settlement, posting collateral, lending, and all other financial activities surrounding the assets rely on the custodian’s involvement, and in our view are best performed by the custodian directly.”

Anchorage will be competing with Coinbase, which offers integrated custody and institutional brokerage through its agency-only OTC desk. Fidelity Digital Assets combines trading and brokerage, but for Bitcoin only. BitGo offers brokerage from custody through a partnership with Genesis Global Trading. But Anchorage hopes its experience handling huge sums, clear pricing, and credentials like membership in Facebook’s Libra Association will win it clients.

McCauley says the biggest threat to Anchorage isn’t competitors, though, but hazy regulation. Anchorage is building a core piece of the blockchain economy’s infrastructure. But for the biggest financial institutions to be comfortable getting involved, lawmakers need to make it clear what’s legal.

Google brings IBM Power Systems to its cloud

As Google Cloud looks to convince more enterprises to move to its platform, it needs to be able to give businesses an onramp for their existing legacy infrastructure and workloads that they can’t easily replace or move to the cloud. A lot of those workloads run on IBM Power Systems with their Power processors, and, until now, IBM was essentially the only vendor that offered cloud-based Power systems. Now, however, Google is also getting into this game by partnering with IBM to launch IBM Power Systems on Google Cloud.

“Enterprises looking to the cloud to modernize their existing infrastructure and streamline their business processes have many options,” writes Kevin Ichhpurani, Google Cloud’s corporate VP for its global ecosystem, in today’s announcement. “At one end of the spectrum, some organizations are re-platforming entire legacy systems to adopt the cloud. Many others, however, want to continue leveraging their existing infrastructure while still benefiting from the cloud’s flexible consumption model, scalability, and new advancements in areas like artificial intelligence, machine learning, and analytics.”

Power Systems support obviously fits in well here, given that many companies use them for mission-critical workloads based on SAP and Oracle applications and databases. With this, they can take those workloads and slowly move them to the cloud, without having to re-engineer their applications and infrastructure. Power Systems on Google Cloud is obviously integrated with Google’s services and billing tools.

This is very much an enterprise offering, without a published pricing sheet. Chances are, given the cost of a Power-based server, you’re not looking at bargain per-minute pricing here.

Because IBM has its own cloud offering, it’s a bit odd to see it work with Google to bring its servers to a competing cloud — though it surely wants to sell more Power servers. The move makes perfect sense for Google Cloud, though, which is on a mission to bring more enterprise workloads to its platform. Any roadblock the company can remove works in its favor, and, as enterprises get comfortable with its platform, they’ll likely bring other workloads to it over time.

Zebra’s SmartSight inventory robot keeps an eye on store shelves

How many times have you gone into a store only to find the shelf empty of the very item you want? This is a frequent problem, and it’s difficult, especially in larger retail establishments, to keep on top of stocking requirements. Zebra Technologies has a solution: a robot that scans the shelves and reports stock gaps to human associates.

The SmartSight robot is a hardware, software and services solution that roams the aisles of the store checking the shelves, using a combination of computer vision, machine learning, workflow automation and robotic capabilities. It can find inventory problems, pricing glitches and display issues. When it finds a problem, it sends a message to human associates via a Zebra mobile computer with the location and nature of the issue.
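
Zebra hasn’t published SmartSight’s internals, but the scan, classify and notify loop described above might look something like this sketch; the issue types, store locations and message format are all assumptions.

```python
# Rough sketch of the scan -> classify -> notify loop described above.
# Issue types, locations, and the messaging hook are hypothetical.
from dataclasses import dataclass

@dataclass
class ShelfIssue:
    aisle: str
    shelf: str
    kind: str      # e.g. "out_of_stock", "price_mismatch", "display_error"
    detail: str

def notify_associate(issue: ShelfIssue) -> str:
    """Stand-in for pushing a message to an associate's mobile computer."""
    return f"[{issue.kind}] Aisle {issue.aisle}, shelf {issue.shelf}: {issue.detail}"

# Pretend the vision model has flagged two problems on its pass through the store.
detected = [
    ShelfIssue("7", "B", "out_of_stock", "cereal facing is empty"),
    ShelfIssue("12", "A", "price_mismatch", "tag shows $3.99, system shows $4.49"),
]
for issue in detected:
    print(notify_associate(issue))
```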

The robot takes advantage of Zebra’s EMA50 mobile automation technology and links to other store systems, including inventory and online ordering systems. Zebra claims it increases available inventory by 95%, while reducing human time spent wandering the aisles to do inventory manually by an average of 65 hours per week.

While it will likely reduce the number of humans required to perform this type of task, Zebra’s senior vice president and general manager of Enterprise Mobile Computing, Joe White, says it’s not always easy to find people to fill these types of positions.

“SmartSight and the EMA50 were developed to help retailers fully capitalize on the opportunities presented by the on-demand economy despite heightened competition and ongoing labor shortage concerns,” White said in a statement.

This is a solution that takes advantage of robotics to help humans keep store shelves stocked and find other issues. The SmartSight robot will be available on a subscription basis starting later this quarter. That means retailers won’t have to worry about owning and maintaining the robot. If anything goes wrong, Zebra would be responsible for fixing it.

Zebra made the announcement at the NRF 2020 conference taking place this week in New York City.

Sisense nabs $100M at a $1B+ valuation for accessible big data business analytics

Sisense, an enterprise startup that has built a business analytics business out of the premise of making big data as accessible as possible to users — whether it be through graphics on mobile or desktop apps, or spoken through Alexa — is announcing a big round of funding today and a large jump in valuation to underscore its traction. The company has picked up $100 million in a growth round of funding that catapults Sisense’s valuation to over $1 billion, funding that it plans to use to continue building out its tech, as well as for sales, marketing and development efforts.

For context, this is a huge jump: The company was valued at only around $325 million in 2016 when it raised a Series E, according to PitchBook. (It did not disclose valuation in 2018, when it raised a venture round of $80 million.) It now has some 2,000 customers, including Tinder, Philips, Nasdaq and the Salvation Army.

This latest round is being led by the high-profile enterprise investor Insight Venture Partners, with Access Industries, Bessemer Venture Partners, Battery Ventures, DFJ Growth and others also participating. The Access investment was made via Claltech in Israel, which appears to be how details of the deal leaked out as rumors in recent days. Insight is in the news today for another big deal: Wearing its private equity hat, the firm acquired Veeam for $5 billion. (And that speaks to a particular kind of trajectory for enterprise companies that the firm backs: Veeam had already been a part of Insight’s venture portfolio.)

Mature enterprise startups that have proven their business cases are going to be an ongoing theme in this year’s fundraising stories, and Sisense is part of that trend, with annual recurring revenue of over $100 million speaking to its stability and current strength. The company has also made some key acquisitions to boost its business, such as the acquisition of Periscope Data last year (coincidentally, also for $100 million, I understand).

Its rise also speaks to a different kind of trend in the market: In the wider world of business intelligence, there is an increasing demand for more digestible data in order to better tap advances in data analytics to use it across organizations. This was also one of the big reasons why Salesforce gobbled up Tableau last year for a slightly higher price: $15.7 billion.

Sisense, which pairs sleek end-user products with a strong emphasis on harnessing the latest developments in areas like machine learning and AI to crunch and organize the data in the first place, represents a smaller, more fleet-footed alternative for its customers. “We found a way to make accessing data extremely simple, mashing it together in a logical way and embedding it in every logical place,” explained CEO Amir Orad to us in 2018.

“We have enjoyed watching the Sisense momentum in the past 12 months, the traction from its customers as well as from industry leading analysts for the company’s cloud native platform and new AI capabilities. That coupled with seeing more traction and success with leading companies in our portfolio and outside, led us to want to continue and grow our relationship with the company and lead this funding round,” said Jeff Horing, managing director at Insight Venture Partners, in a statement.

To note, Access Industries is an interesting backer that could also shape up to be strategic, given holdings and investments that span Warner Music Group, Alibaba, Facebook, Square, Spotify, Deezer, Snap and Zalando.

“Given our investments in market leading companies across diverse industries, we realize the value in analytics and machine learning and we could not be more excited about Sisense’s trajectory and traction in the market,” added Claltech’s Daniel Shinar in a statement.

A.I. Comes to the Operating Room

Brain surgeons are bringing artificial intelligence and new imaging techniques into the operating room, to diagnose tumors as accurately as pathologists, and much faster, according to a report in the journal Nature Medicine.

The new approach streamlines the standard practice of analyzing tissue samples while the patient is still on the operating table, to help guide brain surgery and later treatment.

The traditional method, which requires sending the tissue to a lab, freezing and staining it, then peering at it through a microscope, takes 20 to 30 minutes or longer. The new technique takes two and a half minutes. Like the old method, it requires that tissue be removed from the brain, but uses lasers to create images and a computer to read them in the operating room.

“Although we often have clues based on preoperative M.R.I., establishing diagnosis is a primary goal of almost all brain tumor operations, whether we’re removing a tumor or just taking a biopsy,” said Dr. Daniel A. Orringer, a neurosurgeon at N.Y.U. Langone Health and the senior author of the report.

In addition to speeding up the process, the new technique can also detect some details that traditional methods may miss, like the spread of a tumor along nerve fibers, he said. And unlike the usual method, the new one does not destroy the sample, so the tissue can be used again for further testing.

The new process may also help in other procedures where doctors need to analyze tissue while they are still operating, such as head and neck, breast, skin and gynecologic surgery, the report said. It also noted that there is a shortage of neuropathologists, and suggested that the new technology might help fill the gap in medical centers that lack the specialty.

[Video: Doctors demonstrate acquiring a tissue sample and putting it through the A.I. pipeline to test for brain cancer. Video by Michigan Medicine.]

Algorithms are also being developed to help detect lung cancers on CT scans, diagnose eye disease in people with diabetes and find cancer on microscope slides. The new report brings artificial intelligence — so-called deep neural networks — a step closer to patients and their treatment.

The study involved brain tissue from 278 patients, analyzed while the surgery was still going on. Each sample was split, with half going to A.I. and half to a neuropathologist. The diagnoses were later judged right or wrong based on whether they agreed with the findings of lengthier and more extensive tests performed after the surgery.

The result was a draw: humans, 93.9 percent correct; A.I., 94.6 percent.

The study was paid for by the National Cancer Institute, the University of Michigan and private foundations. Dr. Orringer owns stock in the company that made the imaging system, as do several co-authors, who are company employees. He conducted the research at the University of Michigan, before moving to New York.

“Having an accurate intra-operative diagnosis is going to be very useful,” said Dr. Joshua Bederson, the chairman of neurosurgery for the Mount Sinai Health System, who was not involved in the study. He added, “I think they understated the significance of this.”

He said the traditional method of examining tissue during brain surgery, called a frozen section, often took much longer than 30 minutes, and was often far less accurate than it was in the study. At some centers, he said, brain surgeons do not even order frozen sections because they do not trust them and prefer to wait for tissue processing after the surgery, which may take weeks to complete.

“The neuropathologists I work with are outstanding,” Dr. Bederson said. “They hate frozen sections. They don’t want us to make life-altering decisions based on something that’s not so reliable.”

Dr. Bederson said that the study authors had set a very high bar for their new technique by pitting it against experts at three medical centers renowned for excellence in neurosurgery and neuropathology: Columbia University in New York, the University of Miami and the University of Michigan, Ann Arbor.

“I think that what happened with this study is that because they wanted to do a good comparison, they had the best of the best of the traditional method, which I think far exceeds what’s available in most cases,” Dr. Bederson said.

The key to the study was the use of lasers to scan tissue samples with certain wavelengths of light, a technique called stimulated Raman histology. Different types of tissue scatter the light in distinctive ways. The light hits a detector, which emits a signal that a computer can process to reconstruct the image and identify the tissue.

The system also generates virtual images similar to traditional slides that humans can examine.

The researchers used images from tissue from 415 brain surgery patients to train an artificial intelligence system to identify the 10 most common types of brain tumor.

Some types of brain tumor are so rare that there is not enough data on them to train an A.I. system, so the system in the study was designed to essentially toss out samples it could not identify.
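
The report’s exact mechanism isn’t described here, but “tossing out” unrecognizable samples is commonly done by rejecting predictions whose confidence falls below a threshold; here is a minimal sketch under that assumption, with placeholder class names and an invented cutoff.

```python
# Minimal sketch of confidence-based rejection for a 10-class tumor classifier.
# The threshold and class names are assumptions, not taken from the study.
import math

def softmax(logits):
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

CLASSES = [f"tumor_type_{i}" for i in range(10)]  # placeholder labels
REJECT_THRESHOLD = 0.5                            # hypothetical cutoff

def classify_or_reject(logits):
    probs = softmax(logits)
    best = max(range(len(probs)), key=lambda i: probs[i])
    if probs[best] < REJECT_THRESHOLD:
        return None  # sample is set aside rather than force-labeled
    return CLASSES[best]

print(classify_or_reject([0.1] * 10))         # None: no class stands out
print(classify_or_reject([5.0] + [0.1] * 9))  # "tumor_type_0"
```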

Over all, the system did make mistakes: It misdiagnosed 14 cases that the humans got right. And the doctors missed 17 cases that the computer got right.

“I couldn’t have hoped for a better result,” Dr. Orringer said. “It’s exciting. It says the combination of an algorithm plus human intuition improves our ability to predict diagnosis.”

In his own practice, Dr. Orringer said that he often used the system to determine quickly whether he had removed as much of a brain tumor as possible, or should keep cutting.

“If I have six questions during an operation, I can get them answered without having six times 30 or 40 minutes,” he said. “I didn’t do this before. It’s a lot of burden to the patient to be under anesthesia for that long.”

Dr. Bederson said that he had participated in a pilot study of a system similar to the one in the study and wanted to use it, and that his hospital was considering acquiring the technology.

A.I. Is Learning to Read Mammograms

Artificial intelligence can help doctors do a better job of finding breast cancer on mammograms, researchers from Google and medical centers in the United States and Britain are reporting in the journal Nature.

The new system for reading mammograms, which are X-rays of the breast, is still being studied and is not yet available for widespread use. It is just one of Google’s ventures into medicine. Computers can be trained to recognize patterns and interpret images, and the company has already created algorithms to help detect lung cancers on CT scans, diagnose eye disease in people with diabetes and find cancer on microscope slides.

“This paper will help move things along quite a bit,” said Dr. Constance Lehman, director of breast imaging at the Massachusetts General Hospital in Boston, who was not involved in the study. “There are challenges to their methods. But having Google at this level is a very good thing.”

Tested on images where the diagnosis was already known, the new system performed better than radiologists. On scans from the United States, the system produced a 9.4 percent reduction in false negatives, in which a mammogram is mistakenly read as normal and a cancer is missed. It also delivered a 5.7 percent reduction in false positives, in which the scan is incorrectly judged abnormal but there is no cancer.

On mammograms performed in Britain, the system also beat the radiologists, reducing false negatives by 2.7 percent and false positives by 1.2 percent.
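
For readers keeping score, false-negative and false-positive rates are simple ratios over a confusion matrix; the sketch below uses invented counts purely to pin down the definitions, since only the percentage reductions above come from the study.

```python
# How false-negative and false-positive rates are defined; counts are invented.
def fn_rate(false_negatives: int, true_positives: int) -> float:
    """Share of actual cancers that the reader missed."""
    return false_negatives / (false_negatives + true_positives)

def fp_rate(false_positives: int, true_negatives: int) -> float:
    """Share of healthy scans incorrectly flagged as abnormal."""
    return false_positives / (false_positives + true_negatives)

# Hypothetical reader performance on the same 1,000 scans.
print(fn_rate(false_negatives=20, true_positives=80))    # 0.20
print(fp_rate(false_positives=90, true_negatives=810))   # 0.10
```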

Google paid for the study, and worked with researchers from Northwestern University in Chicago and two British medical centers, Cancer Research Imperial Centre and Royal Surrey County Hospital.

Last year, 268,600 new cases of invasive breast cancer and 41,760 deaths were expected among women in the United States, according to the American Cancer Society. Globally, there are about 2 million new cases a year, and more than half a million deaths.

About 33 million screening mammograms are performed each year in the United States. The test misses about 20 percent of breast cancers, according to the American Cancer Society, and false positives are common, resulting in women being called back for more tests, sometimes even biopsies.

Doctors have long wanted to make mammography more accurate.

“There are many radiologists who are reading mammograms who make mistakes, some well outside the acceptable margins of normal human error,” Dr. Lehman said.

To apply artificial intelligence to the task, the authors of the Nature report used mammograms from about 76,000 women in Britain and 15,000 in the United States, whose diagnoses were already known, to train computers to recognize cancer.

Then, they tested the computers on images from about 25,000 other women in Britain, and 3,000 in the United States, and compared the system’s performance with that of the radiologists who had originally read the X-rays. The mammograms had been taken in the past, so the women’s outcomes were known, and the researchers could tell whether the initial diagnoses were correct.

“We took mammograms that already happened, showed them to radiologists and asked, ‘Cancer or no?’ and then showed them to A.I., and asked, ‘Cancer, or no?’” said Dr. Mozziyar Etemadi, an author of the study from Northwestern University.

This was the test that found A.I. more accurate than the radiologists.

Unlike humans, computers do not get tired, bored or distracted toward the end of a long day of reading mammograms, Dr. Etemadi said.

In another test, the researchers pitted A.I. against six radiologists in the United States, presenting 500 mammograms to be interpreted. Over all, A.I. again outperformed the humans.

But in some instances, A.I. missed a cancer that all six radiologists found — and vice versa.

“There’s no denying that in some cases our A.I. tool totally gets it wrong and they totally get it right,” Dr. Etemadi said. “Purely from that perspective it opens up an entirely new area of inquiry and study. Why is it that they missed it? Why is it that we missed it?”

Dr. Lehman, who is also developing A.I. for mammograms, said the Nature report was strong, but she had some concerns about the methods, noting that the patients studied might not be a true reflection of the general population. A higher proportion had cancer, and the racial makeup was not specified. She also said that “reader” analyses involving a small number of radiologists — this study used six — were not always reliable.

The next step in the research is to have radiologists try using the tool as part of their routine practice in reading mammograms. New techniques that pass their initial tests with flying colors do not always perform as well out in the real world.

“We have to see what happens when radiologists have it, see if they do better,” Dr. Etemadi said.

Dr. Lehman said: “We have to be very careful. We want to make sure this is helping patients.”

She said an earlier technology, computer-aided detection, or CAD, provided a cautionary tale. Approved in 1998 by the Food and Drug Administration to help radiologists read mammograms, it came into widespread use. Some hospital administrators pressured radiologists to use it whether they liked it or not because patients could be charged extra for it, increasing profits, Dr. Lehman said. Later, several studies, including one that Dr. Lehman was part of, found that CAD did not improve the doctors’ accuracy and even made them worse.

“We can learn from the mistakes with CAD and do it better,” Dr. Lehman said, adding that A.I. has become far more powerful, and keeps improving as more data is fed in. “Using computers to enhance human performance is long overdue.”

She and Dr. Etemadi said that a potentially good use of A.I. would be to sort mammograms and flag those most in need of the radiologist’s attention. The system may also be able to identify those that are clearly negative, so they could be read quickly and patients could promptly be given a clean bill of health.

Although developers of A.I. often say it is intended to help radiologists, not replace them, Dr. Lehman predicted that eventually, computers alone will read at least some mammograms, without help from humans.

“We’re onto something,” she said. “These systems are picking up things a human might not see, and we’re right at the beginning of it.”

InsightFinder gets a $2M seed to automate outage prevention

InsightFinder, a startup from North Carolina based on 15 years of academic research, wants to bring machine learning to system monitoring to automatically identify and fix common issues. Today, the company announced a $2 million seed round.

IDEA Fund Partners, a VC out of Durham, N.C., led the round, with participation from Eight Roads Ventures and Acadia Woods Partners. The company was founded by North Carolina State University professor Helen Gu, who spent 15 years researching this problem before launching the startup in 2015.

Gu also announced that she had brought on former Distil Networks co-founder and CEO Rami Essaid to be chief operating officer. Essaid, who sold his company earlier this year, says his new company focuses on taking a proactive approach to application and infrastructure monitoring.

“We found that these problems happen to be repeatable, and the signals are there. We use artificial intelligence to predict and get out ahead of these issues,” he said. He adds that it’s about using technology to be proactive, and he says that today the software can prevent about half of the issues before they even become problems.

If you’re thinking that this sounds a lot like what Splunk, New Relic and Datadog are doing, you wouldn’t be wrong, but Essaid says that these products take a siloed look at one part of the company technology stack, whereas InsightFinder can act as a layer on top of these solutions to help companies reduce alert noise, track a problem when there are multiple alerts flashing and completely automate issue resolution when possible.

“It’s the only company that can actually take a lot of signals and use them to predict when something’s going to go bad. It doesn’t just help you reduce the alerts and help you find the problem faster, it actually takes all of that data and can crunch it using artificial intelligence to predict and prevent [problems], which nobody else right now is able to do,” Essaid said.
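
InsightFinder hasn’t detailed its algorithms here, but the behavior Essaid describes, sitting above existing monitors, grouping related alerts and flagging the combinations that tend to precede an outage, can be sketched in toy form; the alert fields and the risk heuristic below are assumptions.

```python
# Toy sketch of correlating alerts from several monitoring tools into one incident
# and flagging the ones that look predictive. Field names and the scoring rule
# are assumptions, not InsightFinder's actual method.
from collections import defaultdict

alerts = [
    {"source": "datadog",   "service": "checkout", "ts": 100, "signal": "latency_spike"},
    {"source": "new_relic", "service": "checkout", "ts": 105, "signal": "error_rate_up"},
    {"source": "splunk",    "service": "checkout", "ts": 110, "signal": "disk_io_anomaly"},
    {"source": "datadog",   "service": "search",   "ts": 300, "signal": "latency_spike"},
]

def correlate(alerts, window=60):
    """Group alerts for the same service that arrive within one time window."""
    incidents = defaultdict(list)
    for a in sorted(alerts, key=lambda a: a["ts"]):
        key = (a["service"], a["ts"] // window)
        incidents[key].append(a)
    return incidents

for (service, _), group in correlate(alerts).items():
    distinct_tools = {a["source"] for a in group}
    # Naive heuristic: alerts confirmed by several tools are more likely to precede an outage.
    risk = "high" if len(distinct_tools) >= 3 else "low"
    print(service, [a["signal"] for a in group], "predicted risk:", risk)
```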

For now, the software is installed on-prem at its current set of customers, but the startup plans to create a SaaS version of the product in 2020 to make it accessible to more customers.

The company launched in 2015, and has been building out the product using a couple of National Science Foundation grants before this investment. Essaid says the product is in use today in 10 large companies (which he can’t name yet), but it doesn’t have any true go-to-market motion. The startup intends to use this investment to begin to develop that in 2020.

Extra Crunch members get 25% off Otter.ai voice meeting notes

Extra Crunch community perks have a new offer from voice meeting notes service Otter.ai. Starting today, annual and two-year Extra Crunch members can receive 25% off an annual plan for Otter Premium or Otter for Teams.

Otter.ai is an AI-powered assistant that generates rich notes from meetings, interviews, lectures and other voice conversations. You can record, review, search and edit the notes in real time, and organize the conversations from any device. We also use Otter.ai regularly here at TechCrunch to produce transcripts and voice notes from panels at our events, and it’s a great way to easily organize and search the conversations. Learn more about Otter.ai here.

To qualify for the Otter.ai community perk from Extra Crunch, you must be an annual or two-year Extra Crunch member. The 25% discount only applies to annual plans with Otter.ai, but it can be used for either the Premium or Teams plan. You can learn more about the pricing for Otter.ai here, and you can sign up for Extra Crunch here.

Extra Crunch is a membership program from TechCrunch that features how-tos and interviews on company building, intelligence on the most disruptive opportunities for startups, an experience on TechCrunch.com that’s free of banner ads, discounts on TechCrunch events and several community perks like the one mentioned in this article. Our goal is to democratize information about startups, and we’d love to have you join our community.

You can sign up for Extra Crunch here.

After signing up for an annual or two-year Extra Crunch membership, you’ll receive a welcome email with a link to sign up for Otter.ai and claim the discount. Otter.ai offers a free plan with capped minutes, and if you are interested in unlocking the full potential, you can purchase the annual plan with the 25% discount.

If you are already an annual or two-year Extra Crunch member, you will receive an email with the offer at some point over the next 24 hours. If you are currently a monthly Extra Crunch subscriber and want to upgrade to annual in order to claim this deal, head over to the “my account” section on TechCrunch.com and click the “upgrade” button.

This is one of several community perks we’ve launched for Extra Crunch annual members. Other community perks include a 20% discount on TechCrunch events, 100,000 Brex rewards points upon credit card sign up and an opportunity to claim $1,000 in AWS credits. For a full list of perks from partners, head here.

If there are other community perks you want to see us add, please let us know by emailing travis@techcrunch.com.

Sign up for an annual Extra Crunch membership today to claim this community perk. You can purchase an annual Extra Crunch membership here.

Disclaimer:

This offer is provided as a business partnership between TechCrunch and Otter.ai, but it is not an endorsement from the TechCrunch editorial team. TechCrunch’s business operations remain separate to ensure editorial integrity. 

The Machines Are Learning, and So Are the Students

This article is part of our continuing Fast Forward series, which examines technological, economic, social and cultural shifts that happen as businesses evolve.

Jennifer Turner’s algebra classes were once sleepy affairs, and a lot of her students struggled to stay awake. Today, they are active and engaged, thanks to new technologies, including an artificial intelligence-powered program that is helping her teach.

She uses Bakpax, a platform that can read students’ handwriting and auto-grade schoolwork, and she assigns lectures for students to watch online while they are at home. Using the platform has given Mrs. Turner, 41, who teaches at the Gloucester County Christian School in Sewell, N.J., more flexibility in how she teaches, letting her reserve class time for interactive exercises.

“The grades for homework have been much better this year because of Bakpax,” Mrs. Turner said. “Students are excited to be in my room, they’re telling me they love math, and those are things that I don’t normally hear.”

For years, people have tried to re-engineer learning with artificial intelligence, but it was not until the machine-learning revolution of the past seven years that real progress has been made. Slowly, algorithms are making their way into classrooms, taking over repetitive tasks like grading, optimizing coursework to fit individual student needs and revolutionizing the preparation for College Board exams like the SAT. A plethora of online courses and tutorials also have freed teachers from lecturing and allowed them to spend class time working on problem solving with students instead.

While that trend is helping people like Mrs. Turner teach, it has just begun. Researchers are using A.I. to understand how the brain learns and are applying it to systems that they hope will make it easier and more enjoyable for students to study. Machine-learning powered systems not only track students’ progress, spot weaknesses and deliver content according to their needs, but will soon incorporate humanlike interfaces that students will be able to converse with as they would a teacher.

“Education, I think, is going to be the killer app for deep learning,” said Terrence Sejnowski, who runs the Computational Neurobiology Laboratory at the Salk Institute for Biological Studies in La Jolla, Calif., and also is the president of the Neural Information Processing Systems Foundation, which each year puts on the largest machine-learning conference in the world.

It is well established that the best education is delivered one-to-one by an experienced educator. But that is expensive and labor intensive, and cannot be applied at the scale required to educate large populations. A.I. helps solve that.

The first computer tutoring systems appeared in the 1960s, presenting material in short segments, asking students questions as they moved through the material and providing immediate feedback on answers. Because they were expensive and computers far from ubiquitous, they were largely confined to research institutes.

By the 1970s and 1980s systems began using rule-based artificial intelligence and cognitive theory. These systems led students through each step of a problem, giving hints from expert knowledge bases. But rule-based systems failed because they were not scalable — it was expensive and tedious to program extensive domain expertise.

Since then, most computer teaching systems have been based on decision trees, leading students through a preprogrammed learning path determined by their performance — if they get a question right, they are sent in one direction, and if they get the question wrong, they are sent in another. The system may look like it is adapting to the student, but it is actually just leading the student along a preset path.
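
A preset, decision-tree style path of that kind can be captured in a few lines; the lesson names below are placeholders, and the point is that the branch taken depends only on the last answer, not on any learned model of the student.

```python
# Sketch of a preprogrammed, decision-tree style learning path: right answers send
# the student one way, wrong answers another. Lesson names are placeholders.
LESSON_TREE = {
    "fractions_intro":    {"right": "fractions_addition",       "wrong": "fractions_review"},
    "fractions_review":   {"right": "fractions_intro",          "wrong": "fractions_basics_video"},
    "fractions_addition": {"right": "fractions_multiplication", "wrong": "fractions_intro"},
}

def next_lesson(current: str, answered_correctly: bool) -> str:
    branch = "right" if answered_correctly else "wrong"
    return LESSON_TREE.get(current, {}).get(branch, current)

print(next_lesson("fractions_intro", answered_correctly=False))  # fractions_review
print(next_lesson("fractions_intro", answered_correctly=True))   # fractions_addition
```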

But the machine-learning revolution is changing that. Today, learning algorithms uncover patterns in large pools of data about how students have performed on material in the past and optimize teaching strategies accordingly. They adapt to the student’s performance as the student interacts with the system. Bakpax asks teachers to notify parents how their children’s data will be used, and parents can opt out. But Bakpax and other companies say they mask identities and encrypt the data they do collect.
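
By contrast, a data-driven selector bases the next item on the student’s accumulated performance. The toy sketch below estimates per-topic mastery from answer history and drills the weakest topic; it is only a stand-in for the far richer models these systems actually use, and the topics and history are invented.

```python
# Toy adaptive selector: estimate per-topic mastery from the student's own answer
# history and serve the topic with the lowest estimate. Data is invented.
from collections import defaultdict

history = [
    ("linear_equations", True), ("linear_equations", True),
    ("word_problems", False), ("word_problems", True),
    ("graphing", False), ("graphing", False),
]

def mastery_estimates(history):
    correct, seen = defaultdict(int), defaultdict(int)
    for topic, was_correct in history:
        seen[topic] += 1
        correct[topic] += int(was_correct)
    # Simple smoothed success rate per topic (Laplace smoothing).
    return {t: (correct[t] + 1) / (seen[t] + 2) for t in seen}

def pick_next_topic(history):
    estimates = mastery_estimates(history)
    return min(estimates, key=estimates.get)

print(mastery_estimates(history))
print(pick_next_topic(history))  # "graphing": the weakest topic gets drilled next
```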

Studies show that these systems can raise student performance well beyond the level of conventional classes and even beyond the level achieved by students who receive instruction from human tutors. A.I. tutors perform better, in part, because a computer is more patient and often more insightful.

One of the first commercial applications of machine learning to teaching was by the company Knewton, founded by Jose Ferreira, a former executive at the private education company Kaplan. Knewton uses a mix of learning algorithms to evaluate students and match material to their needs.

“After a few questions we could very quickly figure out what level you are at and the optimal piece of content for teaching,” Mr. Ferreira said. “The more you worked with the system, the better our profile of you got and the more we could give you better and better content.”

Nonetheless, Knewton ran into financial difficulties and was sold in May to the education publisher Wiley. Mr. Ferreira said the company’s troubles were not because its technology did not work, but because the company had relied heavily on one customer, which dropped Knewton in favor of an in-house system. Mr. Ferreira, 51, left to start Bakpax.

At its core, Bakpax is a computer vision system that converts handwriting to text and interprets what the student meant to say. The system’s auto-grader teaches itself how to score.

“Instead of handing your homework in, you just take a picture of it on your phone, and a few seconds later we can tell you what you got right and what you got wrong,” Mr. Ferreira said. “We can even tell you what the right answer is for the ones you got wrong.”
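
That photo-to-grade pipeline can be sketched end to end, with the handwriting-recognition step stubbed out; the normalization rule and answer key below are assumptions for illustration, not Bakpax’s actual system.

```python
# Toy sketch of the photo -> text -> grade pipeline described above. The handwriting
# recognition step is stubbed out; normalization and the answer key are hypothetical.

def recognize_handwriting(image_bytes: bytes) -> str:
    """Stand-in for a vision model that converts a photographed answer to text."""
    return "x = 7"  # pretend the model read this from the student's page

def normalize(answer: str) -> str:
    """Interpret what the student meant: strip spaces and casing differences."""
    return answer.replace(" ", "").lower()

ANSWER_KEY = {"problem_3": "x=7"}  # hypothetical answer key

def grade(problem_id: str, image_bytes: bytes) -> bool:
    student_answer = normalize(recognize_handwriting(image_bytes))
    return student_answer == normalize(ANSWER_KEY[problem_id])

print(grade("problem_3", b"...photo bytes..."))  # True
```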

Mrs. Turner said her students loved the immediacy. The system also gathers data over time that allows teachers to see where a class is having trouble or compare one class’s performance with another. “There’s a lot of power in all this information that, right now, literally is just thrown in the trash every day,” Mr. Ferreira said.

Not surprisingly, machine-learning solutions are making their way into the test preparation market, a multibillion-dollar global industry. Riiid, a Korean start-up, is using reinforcement learning algorithms — which learn on their own to reach a specified goal — to maximize the probability of a student achieving a target score within a given time constraint.

Riiid claims students can increase their scores by 20 percent or more with just 20 hours of study. It has already incorporated machine-learning algorithms into its program to prepare students for English-language proficiency tests and has introduced test prep programs for the SAT. It expects to enter the United States in 2020.

Still more transformational applications are being developed that could revolutionize education altogether. Acuitus, a Silicon Valley start-up, has drawn on lessons learned over the past 50 years in education — cognitive psychology, social psychology, computer science, linguistics and artificial intelligence — to create a digital tutor that it claims can train experts in months rather than years.

Acuitus’s system was originally funded by the Defense Department’s Defense Advanced Research Projects Agency for training Navy information technology specialists. John Newkirk, the company’s co-founder and chief executive, said Acuitus focused on teaching concepts and understanding.

The company has taught nearly 1,000 students with its course on information technology and is in the prototype stage for a system that will teach algebra. Dr. Newkirk said the underlying A.I. technology was content-agnostic and could be used to teach the full range of STEM subjects.

Dr. Newkirk likens A.I.-powered education today to the Wright brothers’ early exhibition flights — proof that it can be done, but far from what it will be a decade or two from now.

The world will still need schools, classrooms and teachers to motivate students and to teach social skills, teamwork and soft subjects like art, music and sports. The challenge for A.I.-aided learning, some people say, is not the technology, but bureaucratic barriers that protect the status quo.

“There are gatekeepers at every step,” said Dr. Sejnowski, who together with Barbara Oakley, a computer-science engineer at Michigan’s Oakland University, created a massive open online course, or MOOC, called “Learning How to Learn.”

He said that by using machine-learning systems and the internet, new education technology would bypass the gatekeepers and go directly to students in their homes. “Parents are figuring out that they can get much better educational lessons for their kids through the internet than they’re getting at school,” he said.

Craig S. Smith is a former correspondent for The Times and hosts the podcast Eye on A.I.

Cisco acquires ultra-low latency networking specialist Exablaze

Cisco today announced that it has acquired Exablaze, an Australia-based company that designs and builds advanced networking gear based on field programmable gate arrays (FPGAs). The company focuses on solutions for businesses that need ultra-low latency networking, with a special emphasis on high-frequency trading. Cisco plans to integrate Exablaze’s technology into its own product portfolio.

“By adding Exablaze’s segment leading ultra-low latency devices and FPGA-based applications to our portfolio, financial and HFT customers will be better positioned to achieve their business objectives and deliver on their customer value proposition,” writes Cisco’s head of corporate development Rob Salvagno.

Founded in 2013, Exablaze has offices in Sydney, New York, London and Shanghai. While financial trading is an obvious application for its solutions, the company also notes that it has users in the big data analytics, high-performance computing and telecom space.

Cisco plans to add Exablaze to its Nexus portfolio of data center switches. Beyond integrating Exablaze’s current portfolio, the company says the two will also work on next-generation switches, with an emphasis on creating opportunities to expand these solutions into AI and ML segments.

“The acquisition will bring together Cisco’s global reach, extensive sales and support teams, and broad technology and manufacturing base, with Exablaze’s cutting-edge low-latency networking, layer 1 switching, timing and time synchronization technologies, and low-latency FPGA expertise,” explains Exablaze co-founder and chairman Greg Robinson.

Cisco, which has always been quite acquisitive, has now made six acquisitions this year. Most of these were software companies, but with Acacia Communications, it also recently announced its intention to acquire another fabless semiconductor company that builds optical interconnects.