You might remember the ad cost simulation I discussed in December, and you likely remember the past one to two weeks, during which I've initiated a discussion about what I call File Power. Both are interrelated. When you craft a loyalty program, you are trying to increase File Power: you are trading margin dollars for customers.
Over the past 11 blogs in this series, I have discussed how to build machine learning models for Kaggle's Denoising Dirty Documents competition.
We discussed this research as part of our virtual event on Wednesday, July 24th; you can watch the replay here!
Convolutional Neural Networks (CNNs or ConvNets) excel at learning meaningful representations of features and concepts within images. These capabilities make CNNs extremely valuable for solving problems in the image analysis domain.
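The feature-learning the excerpt describes is built on the convolution operation. A minimal sketch in plain NumPy, with a hand-picked edge-detecting kernel standing in for the kernels a real CNN would learn from data (the function name and example image are illustrative, not from the post):

```python
import numpy as np

# Sliding a small kernel over an image yields a feature map: high responses
# where the local pattern matches the kernel, near zero elsewhere.
def conv2d(image, kernel):
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A toy 5x6 image with a vertical brightness edge down the middle.
img = np.array([[0, 0, 0, 1, 1, 1]] * 5, dtype=float)
edge_kernel = np.array([[1.0, -1.0]])  # responds where brightness changes
fmap = conv2d(img, edge_kernel)
# fmap is nonzero only in the column where the edge sits
```

In a trained CNN the kernel values are parameters learned by gradient descent, and many such kernels are stacked in layers so that later layers detect compositions of the simple features earlier layers respond to.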
In one of my previous posts I discussed how random forests can be turned into a "white box", such that each prediction is decomposed into a sum of contributions from each feature, i.e. prediction = bias + feature_1 contribution + … + feature_n contribution. I've had quite a few requests for code to do this.
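One way to produce that decomposition is to follow a sample's path down each tree and credit every change in the node's mean value to the feature split on at that node. This is a minimal sketch assuming scikit-learn; the function names (`decompose_tree`, `decompose_forest`) are illustrative, not from the original post:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

def decompose_tree(tree, x):
    t = tree.tree_
    node = 0
    bias = t.value[0][0][0]                  # root mean acts as the bias term
    contribs = np.zeros(x.shape[0])
    while t.children_left[node] != -1:       # walk until we reach a leaf
        feat = t.feature[node]
        child = (t.children_left[node] if x[feat] <= t.threshold[node]
                 else t.children_right[node])
        # the change in node value is credited to the feature split on here
        contribs[feat] += t.value[child][0][0] - t.value[node][0][0]
        node = child
    return bias, contribs

def decompose_forest(forest, x):
    # a forest's decomposition is the average of its trees' decompositions
    parts = [decompose_tree(est, x) for est in forest.estimators_]
    bias = np.mean([b for b, _ in parts])
    contribs = np.mean([c for _, c in parts], axis=0)
    return bias, contribs

X, y = make_regression(n_samples=200, n_features=4, random_state=0)
rf = RandomForestRegressor(n_estimators=10, random_state=0).fit(X, y)
bias, contribs = decompose_forest(rf, X[0])
pred = rf.predict(X[:1])[0]
# bias + sum of contributions reproduces the forest's prediction
```

The per-tree sum telescopes exactly to the leaf value, so the identity prediction = bias + Σ contributions holds by construction, up to floating-point rounding.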
The search veteran discussed how Google is entering new markets and what brands can do to insulate themselves.
Please visit Search Engine Land for the full article.
In the last blog, we discussed the various attribution models available in most sophisticated analytics tools, and we now know that considering the position and frequency of occurrence of channels can be instrumental in designing…
The post Answering Every Marketer’s Dilemma – Which attribution model to choose?
A while back, I discussed Recurrent Neural Networks (RNNs), a type of artificial neural network in which some of the connections between neurons point "backwards". When a sequence of inputs is fed into such a network, the backward arrows feed information about earlier input values back into the system at later steps.
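The "backward arrows" the excerpt describes can be sketched as a single recurrence: each step's hidden state mixes the current input with the previous hidden state. A minimal vanilla-RNN step in NumPy (weight names and sizes are illustrative assumptions, not from the post):

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden = 3, 4
W_x = rng.normal(scale=0.1, size=(n_hidden, n_in))      # input-to-hidden weights
W_h = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # recurrent ("backward") weights
b = np.zeros(n_hidden)

def rnn_step(x_t, h_prev):
    """One time step: the new state depends on the input AND the previous state."""
    return np.tanh(W_x @ x_t + W_h @ h_prev + b)

# Feed a sequence; by the last step, h summarizes the whole history so far.
h = np.zeros(n_hidden)
for x_t in rng.normal(size=(5, n_in)):
    h = rnn_step(x_t, h)
```

Because `h` is threaded through every step via `W_h`, information from early inputs can influence the network's output at much later steps, which is exactly what the backward connections buy you.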