• Develop your first web application in Django 1.10 – Part 2

    First of all, I am sorry for taking so long to publish a new post; I was busy shifting house and travelling. In the previous post you learned how to integrate HTML into a Django application. Now we will go a step further and see how you can create a layout and extend it from inner HTML pages. Creating Site Layout Earlier I had just dumped the HTML of the home page into master.html and then used extends to pull it into the inner home page. Now it’s time to make the layout file available to all other pages. Almost every website these days follows a certain theme; it consists of a header, footer,…
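    The layout-and-extends idea the excerpt describes can be sketched in two Django template fragments. This is an illustrative sketch, not the post's actual markup — the file names follow the excerpt (master.html), but the block name and page contents are my placeholders:

    ```html
    <!-- master.html: the shared layout every page inherits -->
    <html>
      <body>
        <header>site header</header>
        {% block content %}{% endblock %}
        <footer>site footer</footer>
      </body>
    </html>
    ```

    ```html
    <!-- home.html: an inner page that only fills the layout's content block -->
    {% extends "master.html" %}
    {% block content %}
      <h1>Welcome to OhBugz</h1>
    {% endblock %}
    ```

    Each inner page then carries only its own content, while the header and footer live in one place.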

  • Develop your first web application in Django 1.10 – Part 1

    In Part 0 I discussed initial installation and configuration. Now it’s time to get into the code. The things I am going to do in this post are: understanding the concept of projects and apps in the Django world; creating an app; using templates to create the home page. Projects vs Apps Django offers a very useful modular approach to creating web applications. Unlike other frameworks such as PHP’s Laravel or Rails, Django lets you create multiple apps under one project. This idea might look alien to those coming from the background of other frameworks where a project == an app and you need to rely on routes etc. to divide functionality. Let me take…
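    The project-vs-apps split shows up concretely in the project's settings file, where each app is registered. A minimal sketch — the project name follows the series (OhBugz), but the app name `issues` is my placeholder, not necessarily the one the post uses:

    ```python
    # ohbugz/settings.py (fragment) -- the project is the wiring layer;
    # apps are registered here and can be reused across projects.
    INSTALLED_APPS = [
        "django.contrib.admin",
        "django.contrib.auth",
        "django.contrib.contenttypes",
        "django.contrib.sessions",
        "django.contrib.messages",
        "django.contrib.staticfiles",
        "issues",  # our app -- one project can register many such apps
    ]
    ```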

  • Develop your first web application in Django 1.10 – Part 0

    Django is a very popular and powerful web framework for building small to large scale web applications in Python. In this series I am going to make a simple issue tracking (bug tracking) system in Django 1.10. OhBugz The application I am going to build is named OhBugz. It is a simple issue tracking application for a single user which helps track project-related issues. It is not going to keep the JIRA folks up at night, but it is good enough to learn how to make a web application in Django. Some of the initial screens are: Installation Since Django is a Python framework, it’s quite obvious…

  • Write your first web crawler in Python Scrapy

    The scraping series would not be complete without discussing Scrapy. In this post I am going to write a web crawler that will scrape data from OLX’s Electronics & Appliances items. Before I get into the code, how about a brief intro to Scrapy itself? What is Scrapy? From Wikipedia: Scrapy (/ˈskreɪpi/ skray-pee) is a free and open source web crawling framework, written in Python. Originally designed for web scraping, it can also be used to extract data using APIs or as a general purpose web crawler. It is currently maintained by Scrapinghub Ltd., a web scraping development and services company. A web crawling framework which has done all…
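    The core job a crawler's parse step performs — pulling item links out of a listing page — can be illustrated with the standard library alone. To be clear, this is not Scrapy's API (Scrapy adds CSS/XPath selectors, scheduling, and item pipelines on top); it is a stdlib-only sketch of the underlying idea, and the HTML sample is made up:

    ```python
    from html.parser import HTMLParser

    class ItemLinkExtractor(HTMLParser):
        """Collect href values from anchor tags on a listing page --
        the same job a spider's parse step does with selectors."""

        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href":
                        self.links.append(value)

    # A made-up snippet of listing-page HTML for illustration.
    listing_html = '<a href="/item/tv-1">TV</a> <a href="/item/fridge-2">Fridge</a>'
    parser = ItemLinkExtractor()
    parser.feed(listing_html)
    print(parser.links)
    ```

    In Scrapy, each extracted link would then be scheduled as a new request, with the framework handling concurrency and politeness for you.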

  • How to integrate QuickBooks Online with Slack through Zapier APIs

    A few days back I was contacted by someone who wanted to get notified on Slack about new invoices created in QuickBooks Online. In this post I am going to share how one can harness the power of Zapier to get notified on Slack about changes in QuickBooks. The work is done in Python. I am going to use sample invoices that I made on QuickBooks Online. Before I move further, let me tell you about QuickBooks, Slack and Zapier. What is QuickBooks? QuickBooks is online accounting software for small businesses which provides features like invoice management, payroll management, inventory and a few others. Since it’s a SaaS-based…

  • Income streams for developers

    5 Ways Developers Can Have Multiple Income Streams

    A couple of weeks back I was having a discussion with a former colleague who was upset because he had no job and was not having any luck finding freelance work either. Another developer was in a similar situation and, despite knowing his stuff, was not able to get work. I have been working independently for 5 years. Prior to that I was engaged in typical 9-5 jobs. When you work solo, you should expect some tough times unless you have multiple revenue streams. What I have learned so far is that developers are not good at selling themselves and they believe that the only way to earn money…

  • Year 2017 : Keep Reading, Keep Writing and Keep Learning

    The year 2016 is almost over. It’s time to take a break from typical technical posts and talk about what was done and what is to be done. 2016 was the year I got involved in many new things, new in the sense that I had left them a long time back and resumed them again. I resumed blogging this year, and in 2016 I was more active than in the years before. I also started reading books; though I was not great at it, I was at least better than in the past. Here’s the list of books I read in 2016, plus a couple of Urdu books that are not listed on Goodreads. I am targeting…

  • Write a Gmail autoresponder by using Python Selenium

    In earlier posts (here and here) I discussed how to use the Python requests and BeautifulSoup libraries to access and scrape a website. This time I am going to make a simple Gmail autoresponder that replies to certain emails. Before I discuss how to do it, a few words about Selenium and why it is going to make our life easier. Advantages of Selenium What does one achieve with Selenium over a lightweight solution based on Python requests and BeautifulSoup? Selenium automates browser activities by simulating clicks and other events, and makes it easier to access information that only becomes available after JavaScript executes on the page. Since it’s automating…

  • How to speed up your python web scraper by using multiprocessing

    In earlier posts, here and here, I discussed how to write a scraper and make it secure and foolproof. These things are good to implement but not good enough to make it fast and efficient. In this post, I am going to show how a change of a few lines of code can speed up your web scraper several times over. Keep reading! If you remember the post, I scraped the detail page of OLX. Usually, you end up on such a page after going through the listing of entries. First, I will make a script without multiprocessing; we will see why it is not good, and then a scraper…
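    The speedup the excerpt promises comes from fetching detail pages in parallel worker processes instead of one after another. A minimal sketch with `multiprocessing.Pool` — the network request is replaced by a URL-based stand-in so the example runs offline, and the function names are mine, not the post's:

    ```python
    from multiprocessing import Pool

    def fetch_price(url):
        # In the real scraper this would request the OLX detail page and
        # parse out the price; the stand-in keeps the sketch offline.
        return len(url) * 10

    def scrape_all(urls, workers=4):
        # Each worker process handles a share of the URLs in parallel,
        # instead of fetching them sequentially in one process.
        with Pool(workers) as pool:
            return pool.map(fetch_price, urls)

    if __name__ == "__main__":
        listing = ["https://www.olx.com/item/%02d" % i for i in range(8)]
        print(scrape_all(listing))
    ```

    With real network-bound requests, the wall-clock time drops roughly in proportion to the number of workers, since the processes wait on I/O concurrently.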

  • How to optimize website speed and performance by using free tools

    Developing a website is not enough these days. Even an “awesome looking” website is not going to serve your purpose unless it loads fast and ranks well on Google. In an earlier post I discussed how you can use free SEO tools to rank better on Google, but Google’s core search algorithm has been evolving continuously and now it also takes site speed into consideration. For that purpose Google relies on its own tool, PageSpeed, which helps you figure out what Google thinks about your website. The goal The goal is: how can a developer, a front-end developer/designer or… even a backend developer like me…