This post is going to be a bit longer as I am going to cover multiple concepts. I will be covering the following things: Smart Contracts and how they work on the Ethereum blockchain. The basics of the Solidity programming language and how to use online and existing IDEs to write and test contracts. Using Truffle and Ganache to set up an Ethereum development environment. Using Web3.py to integrate Smart Contracts with Python applications. What is a Smart Contract? According to Investopedia: A smart contract is a self-executing contract with the terms of the agreement between buyer and seller being directly written into lines of code. The code and the agreements contained therein exist…
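To give a flavour of the Web3.py piece mentioned above, here is a minimal sketch of reading from an already-deployed contract. It assumes a local Ganache node on its default RPC port; the contract address, ABI, and greet() function are hypothetical placeholders, not the post's actual contract:

```python
from web3 import Web3

# Assumes Ganache is running locally on its default RPC port.
w3 = Web3(Web3.HTTPProvider("http://127.0.0.1:8545"))
print(w3.is_connected())  # True if the node is reachable (isConnected() on older web3.py)

# Hypothetical: replace with the checksummed address from your Truffle migration
# output and the ABI emitted by the Solidity compiler for your contract.
contract_address = "0x0000000000000000000000000000000000000000"
abi = [{
    "inputs": [], "name": "greet",
    "outputs": [{"internalType": "string", "name": "", "type": "string"}],
    "stateMutability": "view", "type": "function",
}]

contract = w3.eth.contract(address=contract_address, abi=abi)
print(contract.functions.greet().call())  # read-only call, no gas needed
```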
-
Develop and deploy your first Ethereum Smart Contract with Python
-
Create a crypto Telegram bot in Python using Yahoo Finance API
A step-by-step guide to creating a Telegram bot in Python. So I was exploring Telegram APIs for a project someone had asked me to work on. The script was actually a cron job that would send messages on a daily basis. While working on it, I found that you could come up with your own commands that pull data from some remote API and display the results to Telegram users. I saw this as an opportunity for my next blog post, which I am writing here 🙂 Telegram is very similar to WhatsApp for communication and is quite popular among Crypto lovers. Many Crypto traders use both Telegram and Discord to send crypto and stock signals to their…
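As a taste of the cron-job idea described here, the sketch below pushes a single message through the standard Telegram Bot API sendMessage method; the token and chat id are placeholders you would get from @BotFather and your own chat:

```python
import requests

BOT_TOKEN = "123456:ABC-your-token"   # hypothetical, issued by @BotFather
CHAT_ID = "987654321"                 # hypothetical, your own chat/channel id

def send_message(text: str) -> None:
    # sendMessage is a standard Telegram Bot API method
    url = f"https://api.telegram.org/bot{BOT_TOKEN}/sendMessage"
    response = requests.get(url, params={"chat_id": CHAT_ID, "text": text})
    response.raise_for_status()

if __name__ == "__main__":
    # run this from cron once a day
    send_message("Daily crypto update: BTC is looking interesting today 🙂")
```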
-
Getting started with Protobuffer and Python
In this post, I am going to talk about Protocol Buffers and how you can use them in Python for passing messages across networks. Protocol Buffers, or Protobuf for short, are used for data serialization and de-serialization. Before I discuss Protobuf, I would like to discuss data serialization and de-serialization first. Data Serialization and De-serialization According to Wikipedia: Serialization is the process of translating a data structure or object state into a format that can be stored (for example, in a file or memory data buffer) or transmitted (for example, over a computer network) and reconstructed later (possibly in a different computer environment). In simple words, you convert simple and complex data structures and objects into byte…
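For a feel of what that round trip looks like in Python, here is a minimal sketch. It assumes a hypothetical person.proto defining a Person message with name and id fields, compiled beforehand with protoc:

```python
# Assumes a hypothetical person.proto containing:
#   message Person { string name = 1; int32 id = 2; }
# compiled beforehand with: protoc --python_out=. person.proto
import person_pb2

person = person_pb2.Person()
person.name = "Adnan"
person.id = 1

data = person.SerializeToString()   # serialization: object -> bytes
print(type(data), len(data))

restored = person_pb2.Person()
restored.ParseFromString(data)      # de-serialization: bytes -> object
print(restored.name, restored.id)
```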
-
Getting started with Elasticsearch 7 in Python
I had written about Elasticsearch almost 3 years ago, in June 2018. Since then, a new Elasticsearch version has launched with some new features and changes. I'll be repeating some concepts in this post so you don't have to go back to the old post to learn about them. So, let's begin! What is Elasticsearch? Elasticsearch (ES) is a distributed and highly available open-source search engine that is built on top of Apache Lucene. It is written in Java and is thus available for many platforms. You store unstructured data in JSON format, which also makes it a NoSQL database. So, unlike other NoSQL databases, ES also provides…
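As a quick preview of the Python side, here is a small sketch using the official elasticsearch-py client in its 7.x style; the node URL and the posts index are assumptions for illustration:

```python
from elasticsearch import Elasticsearch

# Assumes a local single-node ES 7 cluster on the default port.
es = Elasticsearch(["http://localhost:9200"])

doc = {"title": "Getting started with Elasticsearch 7", "views": 100}
es.index(index="posts", id=1, body=doc)  # store a JSON document

# full-text match query against the title field
result = es.search(index="posts", body={"query": {"match": {"title": "elasticsearch"}}})
for hit in result["hits"]["hits"]:
    print(hit["_source"])
```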
-
Using Sitemap to write efficient web scrapers
A step-by-step guide to writing web scrapers without using extra resources. This post is part of the Scraping Series. Usually, when you start developing a scraper to scrape loads of records, your first step is to go to the page where all listings are available. You go page by page, fetch individual URLs, store them in a DB or in a file, and then start parsing. Nothing wrong with that. The only issue is the wastage of resources. Say there are 100 records in a certain category. Each page has 10 records. Ideally, you will write a scraper that will go page by page and fetch all links. Then you will switch to the next category and repeat the process.…
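The idea translates to very little code. Here is a sketch that pulls record URLs straight from a sitemap instead of paginating through categories; the sitemap URL is a placeholder:

```python
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://example.com/sitemap.xml"  # hypothetical

def fetch_sitemap_urls(sitemap_url: str) -> list[str]:
    xml = requests.get(sitemap_url, timeout=30).text
    root = ET.fromstring(xml)
    # sitemap entries live in the standard sitemaps.org namespace
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    return [loc.text for loc in root.findall(".//sm:loc", ns)]

if __name__ == "__main__":
    for url in fetch_sitemap_urls(SITEMAP_URL):
        print(url)  # feed these straight into your parser instead of paginating
```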
-
Getting started with GraphQL in Python with FastAPI and Graphene
Learn how to create your first Python-based GraphQL application. This post is part of the FastAPI series. This is another post related to FastAPI (indirectly), in which I am going to discuss how to use GraphQL-based APIs to access and manipulate data. I have already discussed how you can make REST APIs in the FastAPI framework in the previous post. We will be learning some basics of GraphQL and how graphene helps us use Python to write a GraphQL-based application. So, let's get started! What is GraphQL From the official website: GraphQL is a query language for APIs and a runtime for fulfilling those queries with your existing data. GraphQL provides a complete and understandable description of…
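Before wiring anything into FastAPI, the shape of a graphene schema is easy to see in isolation. A tiny sketch with an illustrative hello field:

```python
import graphene

class Query(graphene.ObjectType):
    # a single string field that accepts an optional "name" argument
    hello = graphene.String(name=graphene.String(default_value="world"))

    def resolve_hello(root, info, name):
        return f"Hello {name}!"

schema = graphene.Schema(query=Query)

# execute a query directly against the schema, no HTTP server needed
result = schema.execute('{ hello(name: "GraphQL") }')
print(result.data)  # {'hello': 'Hello GraphQL!'}
```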
-
Getting started with FastAPI and MySQL
Learn how to create DB-driven REST APIs in FastAPI. This post is part of the FastAPI series. In the first post, I introduced you to FastAPI and how you can create high-performance Python-based applications with it. In this post, we are going to work on REST APIs that interact with a MySQL DB. We will also be looking at how we can organize routers and models in multiple files to make them maintainable and easier to read. FastAPI does not restrict you to a certain database framework. You may use SQLAlchemy or any other you want. I prefer to use peewee since it's more expressive and easier to use. Installing Peewee and MySQL Drivers Before we start using MySQL…
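To preview the peewee side, here is a sketch of a MySQL-backed model; the database name, credentials, and Contact fields are placeholders for illustration:

```python
from peewee import MySQLDatabase, Model, CharField

# hypothetical credentials -- substitute your own MySQL setup
db = MySQLDatabase("contacts_db", user="root", password="secret",
                   host="localhost", port=3306)

class Contact(Model):
    first_name = CharField()
    phone = CharField(max_length=20)

    class Meta:
        database = db  # tells peewee which DB this model belongs to

db.connect()
db.create_tables([Contact])  # creates the table if it does not exist
Contact.create(first_name="Adnan", phone="123-456")
```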
-
Create your first REST API in FastAPI
A step-by-step guide to creating high-performance APIs in Python. This post is part of the FastAPI series. In this post, I am going to introduce FastAPI: a Python-based framework to create REST APIs. I will briefly introduce you to some basic features of this framework, and then we will create a simple set of APIs for a contact management system. Knowledge of Python is necessary to use this framework. Before we discuss the FastAPI framework, let's talk a bit about REST itself. From Wikipedia: Representational state transfer (REST) is a software architectural style that defines a set of constraints to be used for creating Web services. Web services that conform to the REST architectural style, called RESTful Web services,…
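For a sense of how little boilerplate FastAPI needs, here is a minimal sketch in the spirit of the contact management example; the routes and in-memory data are illustrative, not the post's final code:

```python
# Run with: uvicorn main:app --reload
from fastapi import FastAPI

app = FastAPI()

contacts = [{"id": 1, "name": "Adnan"}]  # in-memory stand-in for a real DB

@app.get("/contacts")
def list_contacts():
    return contacts

@app.get("/contacts/{contact_id}")
def get_contact(contact_id: int):
    # FastAPI validates and converts contact_id to int automatically
    return next((c for c in contacts if c["id"] == contact_id), {})
```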
-
Create eBay Scraper in Python using Scraper API
Learn how to create an eBay data scraper in Python to fetch item details and price. In this post of ScrapingTheFamous, I am going to write a scraper that will scrape data from eBay. eBay is an online auction site where people put their listings up to sell stuff via auction. Like before, we will be writing two scripts: one to fetch listing URLs and store them in a text file, and the other to parse those links. The data will be stored in JSON format for further processing. I will be using the Scraper API service for scraping purposes, which frees me from all worries about blocking and rendering dynamic sites since it takes care of everything. The first script fetches listings of a category.…
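A sketch of what that first script can look like: fetch a category page through the Scraper API endpoint and dump listing URLs to a text file. The API key, category URL, and CSS selector are placeholders to verify against the live page:

```python
import requests
from bs4 import BeautifulSoup

API_KEY = "YOUR_SCRAPERAPI_KEY"  # hypothetical
category_url = "https://www.ebay.com/b/Laptops/175672"  # hypothetical category

# Scraper API proxies the request and handles blocking/rendering for you
payload = {"api_key": API_KEY, "url": category_url}
html = requests.get("http://api.scraperapi.com", params=payload, timeout=60).text

soup = BeautifulSoup(html, "html.parser")
with open("urls.txt", "w") as f:
    for a in soup.select("a.s-item__link"):  # selector may change; inspect the page
        f.write(a["href"] + "\n")
```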
-
Create Amazon Scraper in Python using Scraper API
Learn how to create an Amazon scraper in Python to scrape product details like price, ASIN, etc. In this post of ScrapingTheFamous, I am going to write a scraper that will scrape data from Amazon. I do not need to tell you what Amazon is. You are here because you already know about it 🙂 So, we are going to write two different scripts: one would be fetch.py, which would fetch URLs of individual listings and save them in a text file. Later, another script, parse.py, will have a function that takes an individual listing URL, scrapes data, and saves it in JSON format. I will be using the Scraper API service for scraping purposes, which frees me from all worries about blocking and rendering dynamic sites since it…
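And a sketch of the parse.py side: one function takes a listing URL, fetches it through the Scraper API, and returns the details as a dict ready to dump as JSON. The key, selectors, and example ASIN are placeholders:

```python
import json
import requests
from bs4 import BeautifulSoup

API_KEY = "YOUR_SCRAPERAPI_KEY"  # hypothetical

def parse(url: str) -> dict:
    payload = {"api_key": API_KEY, "url": url}
    html = requests.get("http://api.scraperapi.com", params=payload, timeout=60).text
    soup = BeautifulSoup(html, "html.parser")
    # selectors are illustrative -- verify against the live product page
    title = soup.select_one("#productTitle")
    price = soup.select_one(".a-price .a-offscreen")
    return {
        "title": title.get_text(strip=True) if title else None,
        "price": price.get_text(strip=True) if price else None,
    }

if __name__ == "__main__":
    record = parse("https://www.amazon.com/dp/B000000000")  # hypothetical ASIN
    print(json.dumps(record, indent=2))
```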