11 Unicorn Companies Using Machine Learning to Tackle Real World Problems

How Companies like Quora, Twitter, Apple, eBay, Google, and Uber Use Machine Learning, Artificial Intelligence, Algorithms, Data Science, and Neural Networks

After a long time, we're back with something rare: a look at the machine learning algorithms, neural networks, and other AI techniques that tech giants use to solve real-world problems. We'll walk through concrete examples, such as which algorithms Quora uses to rank answers and store data, and where Twitter applies its machine learning models. So, let's see how these companies use machine learning to solve real-world problems.


Starting with Quora...

Where and How Quora uses Machine Learning and Artificial Intelligence
Image Credits: Quora

👉 For Ranking Answers [Quora] - Quora uses linear regression, logistic regression, random forests, gradient boosted trees, and neural networks; gradient boosted trees and some deep learning approaches in particular. The models also draw on signals such as the number of upvotes and the previous answers written by the author.
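
To make the idea concrete, here is a toy sketch (not Quora's actual model) of ranking answers with a gradient boosted tree trained on hand-picked signals like upvote count and the author's past answers. The feature names and labels are illustrative assumptions, not Quora's real feature set.

```python
# Toy answer-ranking sketch: gradient boosted trees over simple features.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Each row: [upvotes, author_prior_answers, answer_length] (made-up features)
X = np.array([
    [120, 50, 900],
    [3, 1, 120],
    [45, 10, 600],
    [0, 0, 80],
], dtype=float)
# Relevance labels (e.g. derived from engagement); synthetic here.
y = np.array([1.0, 0.2, 0.7, 0.05])

model = GradientBoostingRegressor(n_estimators=50, random_state=0)
model.fit(X, y)

scores = model.predict(X)
ranked = np.argsort(-scores)  # indices of answers, best first
print(ranked)
```

In production, the labels would come from observed engagement and the model would be evaluated with ranking metrics rather than regression error.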

To Store Data: Quora uses MySQL to store critical data such as questions, answers, upvotes, and comments. The data stored in MySQL is on the order of tens of TB, not counting replicas, and queries per second are on the order of hundreds of thousands. As an aside, Quora also stores a lot of other data in HBase. If MySQL is slow or unresponsive, the Quora site is severely impacted.

Over the years, MySQL usage at Quora has grown in multiple dimensions, including the number of tables, table sizes, read QPS (queries per second), and write QPS. To improve performance, they implemented caching using both Memcache and Redis; Redis's support for data structures such as lists is a big reason for running both caching systems. While caching helped absorb the growing volume of reads, the growth in data size and write QPS led them to look into sharding MySQL.
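
The caching pattern described above can be sketched in a few lines. The dicts below stand in for Memcache/Redis and MySQL; the key name is an illustrative assumption.

```python
# Minimal cache-aside sketch: try the cache, fall back to the database,
# then populate the cache so later reads avoid hitting MySQL.
cache = {}                                 # stand-in for Memcache/Redis
database = {"q:42": "What is sharding?"}   # stand-in for MySQL

def get_question(key):
    # 1. Try the cache first (the common, fast path).
    if key in cache:
        return cache[key]
    # 2. On a miss, read from the primary store...
    value = database.get(key)
    # 3. ...and fill the cache for subsequent reads.
    if value is not None:
        cache[key] = value
    return value

print(get_question("q:42"))  # miss: reads the database, fills the cache
print(get_question("q:42"))  # hit: served from the cache
```

A real deployment would also handle expiry and invalidation on writes, which is where most of the complexity lives.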

To Combine Questions: The company uses natural language processing to merge questions with the same wording or nearly identical intended meaning. You can see its efficiency in how many similar questions it catches.
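
As a deliberately simple stand-in for that NLP, two questions can be flagged as near-duplicates when their token overlap (Jaccard similarity) exceeds a threshold. Quora's real system is far more sophisticated; the 0.6 threshold here is an arbitrary assumption.

```python
# Near-duplicate question detection via Jaccard similarity of token sets.
def jaccard(a, b):
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb)

def near_duplicate(q1, q2, threshold=0.6):
    return jaccard(q1, q2) >= threshold

print(near_duplicate(
    "how do I learn machine learning",
    "how do I learn machine learning fast"))  # True
print(near_duplicate(
    "how do I learn machine learning",
    "what is the capital of France"))         # False
</n```

Token overlap misses paraphrases ("reduce weight" vs. "lose weight"), which is why production systems use learned embeddings instead.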

Other Platforms and Tools: According to a software engineer at Quora, these are the tools and platforms Quora uses for building containers, exporting logs, building clusters, and more:

- Amazon EKS helped them to get off the ground quickly without having to worry about etcd

- Terraform enables them to manage EKS clusters and associated AWS resources using code

- They use Skaffold as a common tool for local development and deployment to production

- BuildKit helps them with smart and concurrent in-cluster builds

- Kustomize has been their primary means of varying resource configuration across multiple environments

- Prometheus has performed well for collecting and serving time-series metrics

- Filebeat, while requiring more memory than Fluent Bit, integrates well with Elasticsearch for exporting logs 

(Source: Engineering at Quora)


Where and How Twitter uses Machine Learning and Artificial Intelligence
Image Credits: Twitter

👉 ML at Twitter: Twitter uses machine learning (ML) models in many applications, from ad selection to abuse detection to content recommendations and beyond. Every second, machine learning models on Twitter perform tens of millions of predictions to better understand user engagement and ad relevance. The company has also recently deployed a SplitNet model in the light ranking stage, where information about user requests and ads is represented by fixed-length embeddings.
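
The "fixed-length embeddings" idea in light ranking can be sketched as scoring each ad by a dot product between a user-request embedding and the ad embedding, then keeping only the top candidates for heavier models. The dimensions and values below are made up; this is not Twitter's SplitNet.

```python
# Light-ranking sketch: cheap dot-product scores over fixed-length embeddings.
import numpy as np

rng = np.random.default_rng(0)
user_embedding = rng.normal(size=8)        # one request, embedded
ad_embeddings = rng.normal(size=(100, 8))  # 100 candidate ads, embedded

scores = ad_embeddings @ user_embedding    # one cheap score per ad
top5 = np.argsort(-scores)[:5]             # candidates passed to heavy ranking
print(top5)
```

The point of this stage is cost: a single matrix-vector product filters thousands of candidates before any expensive model runs.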

In a social network like Twitter, when a person joins the platform, a new node is created. When they follow another person, a follow edge is created. When they change their profile, the node is updated. This stream of events is consumed by an encoder neural network that produces a time-dependent embedding for each node of the graph. The embedding can then be fed into a decoder designed for a specific task, such as predicting future interactions.
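
The event-driven flavor of this can be sketched with a toy update rule: each graph event nudges the embeddings of the nodes involved. The rule below (a small step toward the followed node) is an illustrative assumption, not Twitter's actual encoder.

```python
# Toy event-driven node embeddings: join creates a node, follow updates it.
import numpy as np

embeddings = {}  # node id -> embedding vector
DIM = 4

def on_join(user):
    # New node: initialize an embedding.
    embeddings[user] = np.zeros(DIM)

def on_follow(src, dst, lr=0.5):
    # Follow edge: move the follower's embedding toward the followee's.
    embeddings[src] += lr * (embeddings[dst] - embeddings[src])

on_join("alice")
on_join("bob")
embeddings["bob"] += 1.0      # give bob a non-zero embedding for the demo
on_follow("alice", "bob")
print(embeddings["alice"])    # alice's embedding drifts toward bob's
```

A real temporal graph network learns this update function end-to-end instead of hard-coding it.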

(Source: Twitter Engineering Blog)


Where and How Apple uses Machine Learning and Artificial Intelligence
Image Credits: Apple

👉 Speech Recognition [Apple's Siri] - Siri used multivariate Gaussian mixture models (GMMs) until 2011, replaced them with hidden Markov models (HMMs) until 2014, and has used Long Short-Term Memory networks since then. The "Hey Siri" detector uses a Deep Neural Network (DNN) to convert the acoustic pattern of your voice at each instant into a probability distribution over speech sounds.
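
The final step of that per-frame classification, turning a frame's DNN output scores into a probability distribution over speech sounds, is a softmax. The logits below are made-up numbers standing in for real network outputs.

```python
# Per-frame softmax: logits -> probability distribution over speech sounds.
import numpy as np

def softmax(logits):
    z = np.exp(logits - logits.max())  # subtract max for numerical stability
    return z / z.sum()

frame_logits = np.array([2.0, 0.5, -1.0, 0.1])  # one score per speech sound
probs = softmax(frame_logits)
print(probs.sum())     # the outputs form a valid distribution (sum to 1)
print(probs.argmax())  # index of the most likely speech sound for this frame
```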

The giant unicorn has also introduced a scale-invariant convolution layer and used it as the main component of a tempo-invariant neural network architecture for downbeat tracking. This helps the company achieve higher accuracy with lower capacity than a standard CNN.

A few months ago, Apple proposed a meta-learning framework in which labels are treated as learnable parameters and are optimized along with the model parameters. The learned labels take the model state into account and provide dynamic regularization, thereby improving generalization. They considered two categories of soft labels: class labels, specific to each class, and instance labels, specific to each instance. In supervised learning, training with dynamically learned labels leads to improvements across different datasets and architectures. In the presence of noisy annotations, their framework corrects annotation errors and improves over the state of the art.

(Source: Apple Machine Learning)


Where and How eBay uses Machine Learning and Artificial Intelligence
Image Credits: eBay

👉 Reverse Image Search and Category Detection [eBay] - eBay ShopBot uses reverse image search to find products from a user-uploaded photo. To find the category of that product, the company uses a ResNet-50 convolutional neural network.
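
The core of reverse image search is simple once images are embedded: encode the query photo (in practice with a CNN such as ResNet-50), then return the catalog item whose embedding is closest by cosine similarity. The 4-dimensional embeddings below are fabricated stand-ins for real CNN features.

```python
# Reverse-image-search sketch: nearest neighbor by cosine similarity.
import numpy as np

catalog = {
    "red sneaker":  np.array([0.9, 0.1, 0.0, 0.2]),
    "blue handbag": np.array([0.1, 0.8, 0.3, 0.0]),
    "black watch":  np.array([0.0, 0.2, 0.9, 0.4]),
}

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

def reverse_search(query_emb):
    # Return the catalog item with the most similar embedding.
    return max(catalog, key=lambda name: cosine(catalog[name], query_emb))

query = np.array([0.85, 0.15, 0.05, 0.1])  # "photo of a red sneaker", embedded
print(reverse_search(query))
```

At eBay's scale, the exhaustive `max` over the catalog is replaced by an approximate nearest-neighbor index.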

eBay is leading the industry in applying automatic machine translation to commerce. When buyers search in their own language, eBay translates their query and responds with relevant translated inventory from other countries.

ML at eBay: eBay applies machine learning to item-to-product matching, price prediction, and item categorization, according to Kopru, a research scientist at eBay. They also use it for attribute extraction, generating proper names for browse nodes, filtering product reviews, and more. Machine learning helps them optimize the relevance of shoppers' search and navigation experiences.

eBay's Best Match: The largest scale application of machine learning technology at eBay is currently Best Match, the algorithm used to optimize relevance for buyers during their shopping experiences. Best Match analyzes everything from item popularity to potential value to the buyer, to terms of service such as return policies. 

eBay deploys AI in various areas, from structured data to machine translation to risk and fraud management. 

(Source: Wikipedia and eBay Inc Blog)


Where and How Snapchat uses Machine Learning and Artificial Intelligence
Image Credits: Snapchat

👉 For Face Detection and Filters [Snapchat] - In a research paper published on arXiv, the company details one of its tricks for compressing crucial image-recognition AI while still maintaining acceptable performance. Algorithms like Viola-Jones are used by the company for face detection. To locate facial features and apply filters accordingly, Snapchat uses Active Shape Models and image processing.
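
A key building block of Viola-Jones detection is the integral image: a table where each cell holds the sum of all pixels above and to the left, so the sum of any rectangle (needed for Haar-like features) can be read with four lookups. A minimal numpy version:

```python
# Integral image and O(1) rectangular sums, as used in Viola-Jones.
import numpy as np

def integral_image(img):
    # Cumulative sums along both axes give the summed-area table.
    return img.cumsum(axis=0).cumsum(axis=1)

def rect_sum(ii, top, left, bottom, right):
    # Sum of img[top:bottom+1, left:right+1] via inclusion-exclusion.
    total = ii[bottom, right]
    if top > 0:
        total -= ii[top - 1, right]
    if left > 0:
        total -= ii[bottom, left - 1]
    if top > 0 and left > 0:
        total += ii[top - 1, left - 1]
    return total

img = np.arange(16).reshape(4, 4)
ii = integral_image(img)
print(rect_sum(ii, 1, 1, 2, 2))  # sum of the central 2x2 block -> 30
```

Because every Haar-like feature is a difference of rectangle sums, this table makes evaluating thousands of features per window cheap.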

Ads Recommendation: Exactly which stories end up at the top of your feed depends a lot on how you use Snapchat, and with every swipe and tap, the AI learns a little more. Snap then turns that data into something it can sell to the people who make Snap Stories and want to make sure the effort they put in turns into clicks.

“Search surfaces interesting Stories created by machine learning and allows our community to find Stories for anything they might be interested in,” said Evan Spiegel, CEO of Snap.

Along with that, Snap’s algorithm pays attention to whether you have a preference for certain published stories, and then starts to prefer to show you those publishers.

(Source: Forbes, Inverse and Quartz)


Where and How Deepmind AlphaGo uses Machine Learning and Artificial Intelligence
Image Credits: DeepMind Blog

👉 Algorithm used in AlphaGo - AlphaGo combines machine learning and advanced tree search with deep neural networks. These neural networks take a description of the Go board as an input and process it through 12 different network layers containing millions of neuron-like connections. One neural network, the “policy network,” selects the next move to play. The other neural network, the “value network,” predicts the winner of the game. So much for AlphaGo; let's talk about AlphaGo Zero, a later version.
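
A toy sketch of how the two networks combine: the policy network suggests priors over moves, the value network scores the resulting positions, and the search picks the move with the best combined score. The numbers and the 0.5 mixing weight are illustrative, not AlphaGo's actual selection formula (which also tracks visit counts inside a tree search).

```python
# Combining policy priors and value estimates to pick a move.
policy_priors = {"A": 0.6, "B": 0.3, "C": 0.1}    # from the policy network
position_values = {"A": 0.2, "B": 0.7, "C": 0.5}  # from the value network

def best_move(priors, values, mix=0.5):
    # Score each move by a weighted blend of prior and evaluated value.
    return max(priors, key=lambda m: mix * priors[m] + (1 - mix) * values[m])

print(best_move(policy_priors, position_values))
```

Here move "B" wins despite a lower prior, because the value network rates its resulting position much higher, which is exactly the tension the search resolves.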

AlphaGo Zero (a version created without using data from human games): AlphaGo Zero's neural network was trained using TensorFlow, with 64 GPU workers and 19 CPU parameter servers; only four TPUs were used for inference. The system starts off with a neural network that knows nothing about the game of Go. It then plays games against itself, combining this neural network with a powerful search algorithm. As it plays, the neural network is tuned and updated to predict moves, as well as the eventual winner of the games. There is also a further version, AlphaZero, which is similar but uses a more general, more powerful training setup and is the most advanced version yet.

(Source: Deepmind Blog, Google Blog, Wikipedia)


Where and How Google Translate uses Machine Learning and Artificial Intelligence
Image Credits: Google Blog

👉 Machine Translation [Google Translate] - Since 2016, the deep-learning approach of Google's neural machine translation has relied on a type of software algorithm known as a recurrent neural network. In a blog post published in June 2020, Google noted that advances in machine learning (ML) have driven improvements to automated translation, including the GNMT neural translation model introduced in Translate in 2016, enabling great improvements in translation quality for over 100 languages.

(Source: Google AI Blog)


Where and How Netflix uses Machine Learning and Artificial Intelligence
Image Credits: Netflix Tech Blog

👉 User Rating Prediction & ODE [Netflix] - Restricted Boltzmann Machines (RBMs) were used at Netflix to improve the prediction of user ratings for movies based on collaborative filtering. To provide an optimal dimensional embedding for its users, the company implemented a factorization model popularly known as Singular Value Decomposition (SVD).
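
The SVD idea can be shown on a tiny synthetic rating matrix: factor it, keep only the top latent factors, and read predicted ratings off the low-rank reconstruction. Production systems factor huge sparse matrices with specialized solvers; this is only the textbook version.

```python
# SVD-style rating prediction on a toy dense users-by-movies matrix.
import numpy as np

ratings = np.array([        # rows: users, columns: movies
    [5.0, 4.0, 1.0],
    [4.0, 5.0, 1.0],
    [1.0, 1.0, 5.0],
    [1.0, 2.0, 4.0],
])

U, s, Vt = np.linalg.svd(ratings, full_matrices=False)
k = 2                                    # keep the top-2 latent factors
approx = (U[:, :k] * s[:k]) @ Vt[:k, :]  # low-rank reconstruction

print(np.round(approx[0, 0], 1))  # reconstructed rating: user 0, movie 0
```

The low-rank approximation smooths over noise; in a recommender, the same reconstruction fills in ratings for movies a user has never seen.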

Netflix also uses machine learning to help shape its catalog of movies and TV shows by learning the characteristics that make content successful, and to optimize the production of original movies and TV shows in its rapidly growing studio. Machine learning also lets them optimize video and audio encoding, adaptive bitrate selection, and their in-house content delivery network, which accounts for more than a third of North American internet traffic. It also powers their advertising spend, channel mix, and advertising creative so they can find new members who will enjoy Netflix. In addition, they use techniques and models such as causal modeling, bandits, reinforcement learning, ensembles, neural networks, probabilistic graphical models, and matrix factorization.

(Source: Netflix Tech Blog and Netflix Research)


Where and How Facebook uses Machine Learning and Artificial Intelligence
Image Credits: InfrAI

👉 Detection and Interpretation [Facebook] - Facebook uses DeepText, a text understanding engine, to automatically understand and interpret the content and emotional sentiment of the thousands of posts (in multiple languages) that its users publish every second. With DeepFace, the social media giant can automatically identify you in a photo that is shared on their platform. 

Tulloch, an AI researcher at Facebook, says the company uses an NLP system built around neural networks to identify posts that are excessively promotional, spam, or clickbait. The deep learning model filters these types of posts out and keeps them from showing in users' news feeds. Outside of the news feed, deep learning models are helping Facebook develop products by enabling developers to understand content at a large scale.

(Source: Facebook AI Blog, Forbes and TechTarget)


Where and How HSBC uses Machine Learning and Artificial Intelligence

👉 Customer Behavior & Loan Processing [HSBC] - HSBC is just one bank using artificial neural networks to transform how loan and mortgage applications are processed. The company uses neural networks to analyse customers' previous behaviour patterns.

HSBC recently began utilizing artificial intelligence (AI) from Quantexa, a big data startup backed by HSBC, to track and tackle money laundering. The company also uses AI in the form of a chatbot (Amy) to answer customer queries and speed up service, which will save the firm a great deal of money in the long run.

(Source: Algorithm X Lab)


Where and How Google uses Machine Learning and Artificial Intelligence
Image Credits: Google AI Blog

👉 Google - the search engine that is powered by AI [Google] - According to Wired's Cade Metz, Google's search engine was always driven by algorithms that automatically generate a response to each query. But these algorithms amounted to a set of definite rules: Google engineers could readily change and refine them, and unlike neural nets, they didn't learn on their own. Now, though, Google has incorporated deep learning into its search engine, and with its head of AI taking over Search, the company seems to believe this is the way forward.

Just take a look at the image below. It shows how deeply AI is integrated into most Google products.

Where and How Google Uses Artificial Intelligence and Machine Learning
Image Credits: AI Amplitude

ML in Google Products: Google Ads and DoubleClick both incorporate Smart Bidding, a machine-learning-powered automated bidding system. YouTube's safe-content tooling uses machine learning to ensure that brands are not displayed next to offensive content. Google Search uses AI to surface short, highly relevant parts of a video when you search for something. And Google Chrome uses artificial intelligence to analyze the images on a website and play an audio description or the alt text (when available) for people who are blind or have low vision.

(Source: Wired Magazine and Google Blog)


Where and How Uber Uses Artificial Intelligence and Machine Learning
Image Credits: Uber Engineering Blog

👉 CoordConv - A New CNN Layer [Uber] - Uber uses the CoordConv layer in many domains that involve coordinate transforms, from designing self-driving vehicles, to automating street sign detection for building maps, to maximizing the efficiency of spatial movements in the Uber Marketplace.
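
The core CoordConv trick, per the Uber paper, is small: before a convolution, append two extra channels holding each pixel's normalized (x, y) coordinates, so the filters can see location. A minimal numpy sketch of just that channel-concatenation step:

```python
# CoordConv's input transform: append normalized coordinate channels.
import numpy as np

def add_coord_channels(feature_map):
    """feature_map: (channels, height, width) -> (channels + 2, h, w)."""
    _, h, w = feature_map.shape
    # y coordinates vary down the rows, x coordinates across the columns,
    # both normalized to [-1, 1].
    ys = np.linspace(-1.0, 1.0, h).reshape(h, 1).repeat(w, axis=1)
    xs = np.linspace(-1.0, 1.0, w).reshape(1, w).repeat(h, axis=0)
    return np.concatenate([feature_map, ys[None], xs[None]], axis=0)

x = np.zeros((3, 4, 4))   # e.g. a 3-channel feature map
out = add_coord_channels(x)
print(out.shape)          # (5, 4, 4)
```

An ordinary convolution is translation-invariant by design, so it cannot learn "where" something is; these two extra channels give it that information without changing the convolution itself.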

Ludwig and Uber's project: Uber uses Ludwig (a code free deep learning toolbox) in various projects, including Customer Obsession Ticket Assistant (COTA) to reduce ticket resolution time, information extraction from driver licenses, identification of points of interest during conversations between driver-partners and riders, food delivery time prediction, and much more. 

Pipelines at Uber: Uber uses two pipelines: the Deep Learning Spark Pipeline (DLSP) and the Model Life-cycle Management Pipeline (MLMP). These pipelines not only help them train and deploy deep learning models into Uber's production system, but also retrain and refresh the models to keep them at peak performance.

Uber's One-Click Chat: One-click chat (OCC) is just one of a multitude of NLP and conversational AI initiatives at Uber. OCC leverages Uber's machine learning platform, Michelangelo, to perform NLP on rider chat messages and generate appropriate responses.

PRDS inside Uber - Uber uses Plato Research Dialogue System (PRDS) to create, train, and evaluate conversational AI agents in various environments. It supports interactions through speech, text, or dialogue acts and each conversational agent can interact with data, human users, or other conversational agents.

(Source: Engineering at Uber)

Note: This article will be updated from time to time as we dive into other giants like Neuralink, Reddit, Tesla, Tata, Alibaba, IBM, Nvidia, Microsoft, NASA, Spotify, Starbucks, Salesforce, Amazon, Domino's, and other companies. So stay tuned, and if you like this article, please help it reach more and more people by sharing it with your friends and your internet community. Thanks for reading till the end. If you want us to add anything to this article, share your thoughts with us on any of our social media accounts. Your feedback matters to us!
February 25, 2021

