Sunday 25 September 2016

Chatbots

Just as people use language to communicate with each other, they want to use language to communicate with computers. This led to the development of chatbots. Chatbots are computer programs that interact with users in natural language. They are also known as machine conversation systems, virtual agents, dialogue systems or chatterbots. A chatbot architecture integrates a language model and computational algorithms. Let's have a look at the different chatbot systems:
  • ALICE chatbot system (Artificial Linguistic Internet Computer Entity): ALICE's knowledge about English conversation patterns is stored in AIML (Artificial Intelligence Markup Language) files. AIML is made up of units called topics and categories. The topic is an optional top-level element; it has a name and a set of categories related to that topic. Categories are the basic units of knowledge in AIML. ALICE does not save the history of the conversation.
  • Pandorabot chatbots: Pandorabots is a web service for building and deploying chatbots. Early in development Pandorabot chatbots were text-only, but some now incorporate speech synthesis.
  • ELIZA: ELIZA was the first attempt at building a chatbot as a tool of entertainment. Its responses are mainly generated from the user's input.
  • Sofia: This chatbot was used in the Harvard Mathematics Department to assist in teaching mathematics.
  • YPA: Yellow Pages contain advertisements with the advertiser's name and contact information. YPA is a natural language dialogue system that lets users retrieve information from British Telecom's Yellow Pages.
  • Virtual Patient bot (VPbot): VPbot simulates a patient that medical students can interview. It was successfully tested in Harvard Medical School's virtual patient program.
  • Happy Assistant: It helps users access e-commerce sites to find relevant information about products and services.
  • Sanelma: A fictional person to talk with in a museum, providing background information about a particular piece of art.
  • RITA (Real time Internet Technical Assistant): A graphical avatar used by ABN AMRO Bank to help customers with financial tasks.
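To make the AIML structure above concrete, here is a minimal sketch (not ALICE's actual engine) of how an AIML file could be loaded and matched. The fragment and the greeting pattern are hypothetical examples, and real AIML matching supports wildcards that this sketch omits:

```python
import xml.etree.ElementTree as ET

# A hypothetical AIML fragment: one topic holding one category.
# <pattern> is the user input to match; <template> is the bot's reply.
AIML = """
<aiml>
  <topic name="greetings">
    <category>
      <pattern>HELLO</pattern>
      <template>Hi there!</template>
    </category>
  </topic>
</aiml>
"""

def build_rules(aiml_text):
    """Map each pattern to its template, the basic unit of AIML knowledge."""
    root = ET.fromstring(aiml_text)
    rules = {}
    for category in root.iter("category"):
        pattern = category.findtext("pattern").strip()
        template = category.findtext("template").strip()
        rules[pattern] = template
    return rules

def respond(rules, user_input):
    # AIML patterns are written in upper case, so normalise the input first.
    return rules.get(user_input.strip().upper(), "I have no answer for that.")

rules = build_rules(AIML)
print(respond(rules, "hello"))   # -> Hi there!
```

Since categories are just pattern/template pairs, a dictionary lookup is enough for exact matches; a full interpreter would add wildcard and topic-aware matching on top of this.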



Live to learn and you will really learn to live!

Friday 16 September 2016

Prisma & Machine Learning

Hello... You might be wondering how I switched the topic from the Natural Language API to deep learning! So let me make that clear. Whatever concept I read about each day becomes my new thought for that day. That's it. Over the last couple of months I've noticed that most of my social media friends tag Prisma in their images, so I thought of investigating the internals of the app. Prisma is a Russian app which uses neural networks to turn images into paintings. It is similar to Google's Deep Dream image recognition software. When we upload an image to the app, the image is transferred to its servers in Moscow, where artificial intelligence and neural networks process it, and the result is returned to the phone.
Deep learning is a branch of machine learning. It consists of a set of algorithms that model high-level abstractions in data. Some deep learning architectures include deep neural networks, convolutional neural networks, deep belief networks and recurrent neural networks. Neural networks are used to perform tasks that are easy for humans but difficult for machines. Neural networks acquire knowledge by learning, and this knowledge is used to model outputs for future inputs.
The different learning strategies can be divided into three:
  • Supervised learning: This involves providing a set of predefined inputs and outputs for learning. E.g.: face recognition
  • Unsupervised learning: Used when we don't have an example dataset with known answers. E.g.: clustering
  • Reinforcement learning: A strategy built on observation, where the system learns by trial and error from feedback on its actions. E.g.: robotics
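The supervised strategy can be sketched with the simplest neural network of all, a single perceptron. This toy example (my own illustration, not how Prisma works) trains on a predefined set of inputs and outputs, the logical AND function, and adjusts its weights whenever its output disagrees with the target:

```python
import random

# Supervised learning: predefined inputs paired with known outputs (logical AND).
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

random.seed(0)
w = [random.uniform(-1, 1) for _ in range(2)]  # weights, one per input
b = random.uniform(-1, 1)                      # bias
lr = 0.1                                       # learning rate

def predict(x):
    # Fire (output 1) when the weighted sum of inputs exceeds zero.
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

# Learning: nudge each weight in proportion to the prediction error.
for _ in range(100):
    for x, target in data:
        error = target - predict(x)
        w[0] += lr * error * x[0]
        w[1] += lr * error * x[1]
        b += lr * error

print([predict(x) for x, _ in data])  # -> [0, 0, 0, 1]
```

After training, the network has "acquired" the AND function from examples alone, which is exactly the sense in which neural networks model outputs for future inputs.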
Neural networks are used in the following fields:
  • Pattern recognition: The most common example is facial recognition.
  • Time series prediction: A popular example is predicting the ups and downs of stock markets.
  • Control: For instance, the design of self-driving cars.
  • Signal processing: One of the best examples is the design of cochlear implants.

Every time we are being redirected to something better!

Monday 12 September 2016

Google Cloud Natural Language Processing API

A couple of months ago I got a mail from the Google Cloud team about their new product launch. Out of inborn curiosity I started Googling to find out what it was. So let me share my thoughts on the same.
Google is consistently making advancements in machine learning. Last year it open-sourced TensorFlow, its machine learning software library. Earlier this year it introduced SyntaxNet, a neural-network natural language processing framework for TensorFlow. And now comes the Cloud Natural Language API.
This REST API reveals the structure and meaning of text. Initially it supports the following natural language processing tasks:
  • Entity Recognition: Identify the different entity types, such as persons, locations, organisations and events, in the text.
  • Sentiment Analysis: Understand the overall sentiment of the given text.
  • Syntax Analysis: Identify parts of speech and create a dependency parse tree for the input sentence.
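As a rough sketch of how the REST API is called (this is my own illustration, not an official client), a sentiment analysis request wraps the text in a small JSON envelope and posts it to the `documents:analyzeSentiment` method; `API_KEY` is a placeholder you would replace with your own credential:

```python
import json
import urllib.request

API_KEY = "YOUR_API_KEY"  # placeholder credential
ENDPOINT = ("https://language.googleapis.com/v1/"
            "documents:analyzeSentiment?key=" + API_KEY)

def build_request(text):
    """Wrap plain text in the document envelope the API expects."""
    return {
        "document": {"type": "PLAIN_TEXT", "content": text},
        "encodingType": "UTF8",
    }

def analyze_sentiment(text):
    # Requires network access and a valid API key.
    body = json.dumps(build_request(text)).encode("utf-8")
    req = urllib.request.Request(
        ENDPOINT, data=body, headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        # documentSentiment carries a score from -1.0 (negative) to 1.0 (positive).
        return json.load(resp)["documentSentiment"]

print(build_request("Happy reading!")["document"]["content"])
```

The entity and syntax tasks follow the same shape, with `documents:analyzeEntities` and `documents:analyzeSyntax` in place of the sentiment method.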
The primary languages supported by the API are English, Spanish and Japanese, and it has client libraries for Java, Python and Node.js. One of the major alpha customers for this API is Ocado Technology, the technology arm of Ocado, a popular British online supermarket.
If we want to stay within the Google stack for analytics, natural language processing can be done with the Google Cloud Natural Language API, the processing results can be kept in a BigQuery table (BigQuery is a RESTful web service for data storage provided by Google), and the visualization can be done with Google Data Studio. Please note that Google Data Studio is currently available only in the U.S.
Happy reading!


Dream up to the Stars so that you can end up in the Clouds.

Thursday 1 September 2016

Selfie Mining

Nowadays it's common for personal stories to be told through social images. We might think the pictures we snap of ourselves and post on social media are just for our friends on those platforms, but it's high time to correct this misbelief. Only the data we mark as private is actually guarded by privacy laws; everything else is public. Marketers are grabbing our images for research. This process is called selfie mining.
When we take a picture of ourselves, we do so without a specific product in mind. But that is not the case for marketers. They might be interested in our clothing, the products we use, the emotions on our faces and so on. There are companies that mine selfies. They use APIs to access the images, and the most interesting aspect is that the owners are unaware of it. Intentionally or not, selfies promote whatever we are wearing, sitting near or using. Many digital marketing companies have built technology to scan and process photos to identify particular interests or hobbies, which in turn helps them better target advertisers.
Two of these companies are Ditto Labs and Piqora.
Ditto Labs: It scans photos on sites like Instagram to generate insights for customers. Ditto Labs places users into categories, such as “sports fans” and “foodies”, based on the context of their images. Advertisers such as Kraft Foods Group Inc. pay Ditto Labs to find their products' logos in photos on social media. The following aspects are taken into consideration:

  • Products- Users who post images of food items and beverages are flagged for these interests.
  • Clothing- Ditto classifies objects. It also detects fabrics or patterns in clothing.
  • Faces- The emotions in the face help advertisers to understand sentiment.
  • Logos- Advertisers can search for photos featuring competing brands in order to win over their customers.
  • Scenes- Analysing the background of images helps the advertisers to find where and how customers use their products.

Piqora: It stores images for months on its own servers to show marketers what is trending in popularity. Piqora mainly analyses images on Pinterest. It was recently acquired by Olapic, which analyses images on Instagram.
Well, all this indicates that some of the best digital marketing trends are on the way. Let's hope that the best is yet to come in the near future.

Sources:
http://programmaticadvertising.org/2014/10/20/selfie-mining-whats-really-going-on/
http://www.wsj.com/articles/smile-marketing-firms-are-mining-your-selfies-1412882222
We anyways have to think, why not think big?