  • 10 Ways to Save Costs on AWS

    If your organization runs workloads on Amazon Web Services (AWS), AWS is probably a significant part of your IT expenditure. Cutting costs is always a priority, so I put together a list of handy ways you can reduce waste and optimize your AWS usage to significantly reduce costs. 1. AWS Trusted Advisor: AWS Trusted Advisor is an online tool that provides real-time recommendations. It analyzes your AWS environment and then offers best practices you can implement to better utilize your resources. The Advisor offers guidance across five categories: cost optimization, performance, fault tolerance, service limits, and security. It is up to you whether…
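
    As a hedged, programmatic taste of the first tip, the sketch below lists Trusted Advisor's cost optimization checks via boto3; it assumes you have a Business or Enterprise Support plan, since the AWS Support API (which backs Trusted Advisor) requires one:

      import boto3

      # The AWS Support API is only available in us-east-1 and requires a
      # Business or Enterprise Support plan.
      client = boto3.client("support", region_name="us-east-1")

      response = client.describe_trusted_advisor_checks(language="en")
      for check in response["checks"]:
          if check["category"] == "cost_optimizing":  # category name per the Support API
              print(check["id"], check["name"])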

  • Using Sitemap to write efficient web scrapers
    A step-by-step guide to writing web scrapers without using extra resources.

    This post is part of the Scraping Series. Usually, when you start developing a scraper to scrape loads of records, your first step is to go to the page where all listings are available. You go page by page, fetch individual URLs, store them in a DB or in a file, and then start parsing. There is nothing wrong with that; the only issue is the waste of resources. Say there are 100 records in a certain category, and each page has 10 records. Typically, you would write a scraper that goes page by page and fetches all links, then switch to the next category and repeat the process.…
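
    As a rough illustration of the sitemap approach the post describes, the sketch below pulls listing URLs straight from a site's sitemap.xml instead of paginating through category pages; the sitemap URL is a placeholder, not from the post:

      import requests
      from xml.etree import ElementTree as ET

      SITEMAP_URL = "https://example.com/sitemap.xml"  # placeholder sitemap location
      NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}  # standard sitemap namespace

      def fetch_sitemap_urls(sitemap_url):
          """Return every <loc> entry from a sitemap.xml document."""
          response = requests.get(sitemap_url, timeout=30)
          response.raise_for_status()
          root = ET.fromstring(response.content)
          return [loc.text for loc in root.findall(".//sm:loc", NS)]

      if __name__ == "__main__":
          for url in fetch_sitemap_urls(SITEMAP_URL):
              print(url)  # store in a DB or file instead of crawling category pages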

  • Getting started with GraphQL in Python with FastAPI and Graphene
    Learn how to create your first Python-based GraphQL application

    This post is part of the FastAPI series. It is another post related to FastAPI (indirectly), in which I am going to discuss how to use GraphQL-based APIs to access and manipulate data. I have already discussed how you can build REST APIs in the FastAPI framework in the previous post. We will be learning some basics of GraphQL and how Graphene helps us use Python to write a GraphQL-based application. So, let’s get started! What is GraphQL? From the official website: GraphQL is a query language for APIs and a runtime for fulfilling those queries with your existing data. GraphQL provides a complete and understandable description of…
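
    As a minimal, hedged sketch of what a Graphene schema looks like (the field and argument names are illustrative assumptions, not necessarily the ones used in the post):

      import graphene

      class Query(graphene.ObjectType):
          # A single illustrative field; the post's actual schema may differ.
          hello = graphene.String(person=graphene.String(default_value="world"))

          def resolve_hello(root, info, person):
              return f"Hello {person}"

      schema = graphene.Schema(query=Query)

      # Execute a query directly, with no web framework involved yet.
      result = schema.execute('{ hello(person: "GraphQL") }')
      print(result.data)  # {'hello': 'Hello GraphQL'}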

  • Getting started with FastAPI and MySQL
    Learn how to create DB-driven REST APIs in FastAPI

    This post is part of the FastAPI series. In the first post, I introduced you to FastAPI and how you can create high-performance Python-based applications with it. In this post, we are going to work on REST APIs that interact with a MySQL DB. We will also look at how we can organize routers and models across multiple files to make them maintainable and easier to read. FastAPI does not restrict you to a certain database framework. You may use SQLAlchemy or any other you want. I prefer to use peewee since it’s more expressive and easier to use. Installing Peewee and MySQL Drivers: Before we start using MySQL…
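
    For flavor, here is a minimal sketch of a peewee model backed by MySQL; the database name, credentials, and Contact fields are placeholders, not the post's exact code:

      from peewee import MySQLDatabase, Model, CharField

      # Placeholder connection details; a MySQL driver such as pymysql
      # or mysqlclient must be installed as well.
      db = MySQLDatabase("contacts_db", user="root", password="secret",
                         host="127.0.0.1", port=3306)

      class Contact(Model):
          """An illustrative table; the post's models may differ."""
          first_name = CharField()
          last_name = CharField()
          phone = CharField(max_length=20)

          class Meta:
              database = db

      if __name__ == "__main__":
          db.connect()
          db.create_tables([Contact])
          Contact.create(first_name="Jane", last_name="Doe", phone="555-0100")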

  • Create your first REST API in FastAPI
    A step-by-step guide to creating high-performance APIs in Python

    This post is part of the FastAPI series. In this post, I am going to introduce FastAPI: a Python-based framework for creating REST APIs. I will briefly introduce you to some basic features of this framework, and then we will create a simple set of APIs for a contact management system. Knowledge of Python is necessary to use this framework. Before we discuss the FastAPI framework, let’s talk a bit about REST itself. From Wikipedia: Representational state transfer (REST) is a software architectural style that defines a set of constraints to be used for creating Web services. Web services that conform to the REST architectural style, called RESTful Web services,…
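
    As a taste of what the post builds toward, here is a minimal FastAPI sketch; the contact endpoints and fields are illustrative assumptions, not the post's final code:

      from typing import List
      from fastapi import FastAPI
      from pydantic import BaseModel

      app = FastAPI()

      class Contact(BaseModel):
          # Illustrative fields for a contact management system.
          name: str
          phone: str

      # In-memory store, just for the sketch.
      contacts: List[Contact] = []

      @app.post("/contacts")
      def create_contact(contact: Contact) -> Contact:
          contacts.append(contact)
          return contact

      @app.get("/contacts")
      def list_contacts() -> List[Contact]:
          return contacts

      # Run with: uvicorn main:app --reload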

  • Create eBay Scraper in Python using Scraper API
    Learn how to create an eBay data scraper in Python to fetch item details and prices.

    In this post of ScrapingTheFamous, I am going to write a scraper that will scrape data from eBay. eBay is an online auction site where people put up listings to sell their stuff through auctions. Like before, we will be writing two scripts: one to fetch listing URLs and store them in a text file, and the other to parse those links. The data will be stored in JSON format for further processing. I will be using the Scraper API service for parsing purposes, which frees me from all worries about blocking and rendering dynamic sites since it takes care of everything. The first script fetches the listings of a category.…
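
    A rough sketch of the first script's idea follows; the category URL and CSS selector are assumptions (eBay's markup changes over time), and you would need your own Scraper API key:

      import requests
      from bs4 import BeautifulSoup

      API_KEY = "YOUR_SCRAPERAPI_KEY"  # placeholder key
      CATEGORY_URL = "https://www.ebay.com/b/example-category"  # placeholder category page

      def fetch_listing_urls(category_url):
          """Fetch a category page via Scraper API and collect listing links."""
          payload = {"api_key": API_KEY, "url": category_url}
          response = requests.get("http://api.scraperapi.com", params=payload, timeout=60)
          response.raise_for_status()
          soup = BeautifulSoup(response.text, "html.parser")
          # The selector is an assumption about eBay's current listing markup.
          return [a["href"] for a in soup.select("a.s-item__link") if a.get("href")]

      if __name__ == "__main__":
          with open("urls.txt", "w") as f:
              f.write("\n".join(fetch_listing_urls(CATEGORY_URL)))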

  • Create Amazon Scraper in Python using Scraper API
    Learn how to create an Amazon scraper in Python to scrape product details like price, ASIN, etc.

    In this post of ScrapingTheFamous, I am going to write a scraper that will scrape data from Amazon. I do not need to tell you what Amazon is. You are here because you already know about it 🙂 So, we are going to write two different scripts: the first, fetch.py, will fetch URLs of individual listings and save them in a text file. The other, parse.py, will have a function that takes an individual listing URL, scrapes the data, and saves it in JSON format. I will be using the Scraper API service for parsing purposes, which frees me from all worries about blocking and rendering dynamic sites since it…
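
    As a counterpart to the fetch sketch above, here is a rough idea of what the parse step might look like; the selectors and output fields are assumptions, not the post's exact code:

      import json
      import requests
      from bs4 import BeautifulSoup

      API_KEY = "YOUR_SCRAPERAPI_KEY"  # placeholder key

      def parse_listing(listing_url):
          """Fetch one product page via Scraper API and pull a few fields."""
          payload = {"api_key": API_KEY, "url": listing_url}
          response = requests.get("http://api.scraperapi.com", params=payload, timeout=60)
          response.raise_for_status()
          soup = BeautifulSoup(response.text, "html.parser")
          title = soup.select_one("#productTitle")          # selectors are assumptions;
          price = soup.select_one(".a-price .a-offscreen")  # Amazon's markup changes often
          return {
              "url": listing_url,
              "title": title.get_text(strip=True) if title else None,
              "price": price.get_text(strip=True) if price else None,
          }

      if __name__ == "__main__":
          with open("urls.txt") as f:
              records = [parse_listing(url.strip()) for url in f if url.strip()]
          with open("products.json", "w") as out:
              json.dump(records, out, indent=2)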

  • Getting started with Apache Avro and Python
    Learn how to create and consume Apache Avro-based data for more efficient data transfer.

    In this post, I am going to talk about Apache Avro, an open-source data serialization system that is used by tools like Spark, Kafka, and others for big data processing. What is Apache Avro? According to Wikipedia: Avro is a row-oriented remote procedure call and data serialization framework developed within Apache’s Hadoop project. It uses JSON for defining data types and protocols, and serializes data in a compact binary format. Its primary use is in Apache Hadoop, where it can provide both a serialization format for persistent data, and a wire format for communication between Hadoop nodes, and from client programs to the Hadoop services. Avro uses a schema…
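
    To make the idea concrete, here is a minimal hedged sketch of writing and reading Avro records in Python; it uses the fastavro library and an invented user schema, which may differ from the library and schema the post itself uses:

      from fastavro import writer, reader, parse_schema

      # An illustrative schema; Avro schemas are defined in JSON.
      schema = parse_schema({
          "namespace": "example.avro",
          "type": "record",
          "name": "User",
          "fields": [
              {"name": "name", "type": "string"},
              {"name": "age", "type": "int"},
          ],
      })

      records = [{"name": "Alice", "age": 30}, {"name": "Bob", "age": 25}]

      # Serialize to a compact binary Avro file.
      with open("users.avro", "wb") as out:
          writer(out, schema, records)

      # Read the records back; the schema travels with the file.
      with open("users.avro", "rb") as f:
          for record in reader(f):
              print(record)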

  • Create your first REST API in Django Rest Framework
    A step-by-step guide to creating APIs in Django Rest Framework

    In this post, I am going to talk about Django Rest Framework, or DRF. DRF is used to create RESTful APIs in Django that can later be consumed by various apps: mobile, web, desktop, etc. We will discuss how to install DRF on your machine and then write our APIs for a system. Before we discuss DRF, let’s talk a bit about REST itself. What is REST? From Wikipedia: Representational state transfer (REST) is a software architectural style that defines a set of constraints to be used for creating Web services. Web services that conform to the REST architectural style, called RESTful Web services, provide interoperability between…
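
    For orientation, here is a minimal hedged sketch of a DRF serializer and viewset; the Contact model and its fields are assumptions for illustration, not the post's actual system:

      # serializers/views sketch for an assumed Contact model
      from rest_framework import serializers, viewsets
      from .models import Contact  # hypothetical Django model with name/phone fields

      class ContactSerializer(serializers.ModelSerializer):
          class Meta:
              model = Contact
              fields = ["id", "name", "phone"]

      class ContactViewSet(viewsets.ModelViewSet):
          """Provides list/create/retrieve/update/delete endpoints."""
          queryset = Contact.objects.all()
          serializer_class = ContactSerializer

      # urls.py sketch: register the viewset with a router
      # from rest_framework.routers import DefaultRouter
      # router = DefaultRouter()
      # router.register(r"contacts", ContactViewSet)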

  • Top 6 Tips for Planning a Successful Azure Migration

    Migrating your workloads to Azure can help you leverage the benefits of cloud computing, including agility, scalability, lower costs, and easier management. However, the migration process can sometimes be complicated: you have to select the proper service model for every workload and establish a migration strategy for all workloads. A well-planned migration strategy can help you make the move without impacting your business. The following tips will help ensure that your Azure migration goes smoothly. Reasons for Migrating to the Cloud: Cloud migration can be risky and expensive, but also rewarding. Here are some of the common drivers for moving workloads and applications to the cloud: Reducing operating costs—the…