In this post, I will outline how I created a big data pipeline for my web server logs using Apache Kafka, Python, and Apache Cassandra.
In this post I will go over what a Python set is and how to use sets in your Python programs. With sets we can manage data just like mathematical sets.
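As a quick taste of what the post covers, here is a minimal sketch of Python sets behaving like mathematical sets (the example values are illustrative):

```python
# Sets hold unique, unordered elements and support math-style operations.
evens = {2, 4, 6, 8}
primes = {2, 3, 5, 7}

print(evens & primes)  # intersection: elements in both sets -> {2}
print(evens | primes)  # union: elements in either set
print(evens - primes)  # difference: evens that are not prime -> {4, 6, 8}
```

The operators mirror set notation, which makes membership and de-duplication logic very readable.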
In this post, we will discuss three awesome Python tools that will sharpen your big data programming skills, using production data.
In this post we will talk about creating Python lists of tuples and how they can be used.
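A minimal sketch of the idea, with illustrative coordinate data: a list of tuples pairs related values, and each tuple acts as a small immutable record.

```python
# A list of tuples: each tuple groups related values together.
points = [(1, 2), (3, 4), (5, 6)]

# Tuples unpack cleanly in a loop.
for x, y in points:
    print(x, y)

# Sort by the second element of each tuple.
by_y = sorted(points, key=lambda p: p[1], reverse=True)
```

Unpacking and key-based sorting are the two patterns that make lists of tuples so handy in practice.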
In this post, I will talk about the Python enumerate() function and give you a few examples of how to use it.
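A quick sketch of what enumerate() does, using made-up sample data: it pairs each element with a running index, so you never need a manual counter.

```python
colors = ["red", "green", "blue"]

# enumerate() yields (index, element) pairs; start=1 shifts the count.
for index, color in enumerate(colors, start=1):
    print(index, color)

# Materialize the pairs to see the structure.
indexed = list(enumerate(colors))  # [(0, 'red'), (1, 'green'), (2, 'blue')]
```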
In this post, we will be aggregating all of our logs into Google BigQuery Audit Logs. Using big data techniques we can simplify our audit log aggregation in the cloud.
Learn how to use Kafka Python to pull Google Analytics metrics and push them to your Kafka Topic. This will allow us to analyze this data later using Spark to give us meaningful business data.
In this Kafka Python tutorial we will create a Python application that will publish data to a Kafka topic, and another app that will consume the messages.
In previous posts, I covered writing a Python REST API. In this article, we will test our API from within EMACS! We are going to install an EMACS REST client to do our testing. This will cut down your development time because you can do everything from one IDE (EMACS) and not have to switch to a browser or another web client to test your REST API.
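For a flavor of what in-editor testing looks like, here is a minimal sketch of a request in restclient-mode syntax (a popular Emacs REST client package); the endpoint URL is a hypothetical local API, not one from the original posts:

```
# Fetch users from a hypothetical local API (run with C-c C-c in restclient-mode)
GET http://localhost:5000/users
Accept: application/json
```

Placing the cursor on the request and invoking it shows the response in a separate buffer, so the whole request/response loop stays inside Emacs.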
In this post we will build on the web service that we made in my last series, Python REST API Example, by adding the ability to pass JSON data in an HTTP POST.
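To preview the client side of this, here is a minimal sketch using only the standard library; it builds (but does not send) a POST request carrying a JSON body. The endpoint URL and payload fields are illustrative, not taken from the series:

```python
import json
import urllib.request

# Serialize the payload to JSON bytes for the request body.
payload = {"name": "widget", "qty": 3}
body = json.dumps(payload).encode("utf-8")

# Construct a POST request with the proper JSON content type.
req = urllib.request.Request(
    "http://localhost:5000/items",  # hypothetical endpoint
    data=body,
    headers={"Content-Type": "application/json"},
    method="POST",
)
# Sending it would be: urllib.request.urlopen(req)
```

The key detail is the `Content-Type: application/json` header, which tells the server to parse the body as JSON rather than form data.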