It has been almost five months since I wrote a blog post. I have been so busy with work and learning new technologies. It's Christmas and holiday time, so I thought of writing a blog about the things I learned in 2019, the mistakes I made, and my 2020 wishlist. I think this will help some young programmers choose their career path and avoid the mistakes I encountered. I will also try to share some of the resources I referred to while learning these technologies.
So here is the list of things I learned in 2019:
At the beginning of 2019, I switched from a2hosting's shared hosting platform to a dedicated VPS on DigitalOcean. I had a very bad performance experience with a2hosting, and DigitalOcean was my first choice as it was relatively cheap compared to the alternatives and had a PayPal payment option. Setting up the VPS was much easier than I thought. This tutorial from J. Alexander Curtis really helped with the initial setup. DigitalOcean also has some awesome tutorials for setting up Let's Encrypt and phpMyAdmin. I used Cloudflare for CDN purposes. Cloudflare is free and also provides a free SSL certificate and a basic layer of security. I really recommend Cloudflare if you run a website like I do.
In the middle of this year, I switched to an AWS EC2 micro free-tier instance to explore the AWS platform. From a performance perspective, the DigitalOcean Droplet and AWS EC2 gave me roughly equal performance; I didn't see a huge difference in speed. The AWS dashboard is much more complex than DigitalOcean's. AWS is best if you have a much more complex application that you need to scale easily.
By the way, I created an SSH config file to log in to my servers easily without typing credentials every time. You can learn more about the SSH config file from the link below.
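As a quick illustration, an entry in `~/.ssh/config` looks something like this (the host alias, IP address, user, and key path here are made-up examples):

```
# ~/.ssh/config
Host myblog
    HostName 203.0.113.10
    User deploy
    IdentityFile ~/.ssh/id_rsa_myblog
    Port 22
```

With that in place, `ssh myblog` is enough to connect, no password or flags needed.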
The first client I worked with after joining was a big e-commerce player in India with more than 5 million users. Previously I had only worked on monolithic projects, where you don't need to deal with multiple ports, switching between different environment versions, build pipelines for continuous integration, etc. Working with this client really helped me learn Nginx reverse proxying, deployment of Node.js microservices, talking to Java Spring Boot microservices from the frontend (React) using HTTP proxy middleware, and taking Jenkins builds. By the way, I used n to switch between different Node.js versions.
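For example, an Nginx reverse-proxy server block that serves the React build as static files and forwards API traffic to a Node.js microservice might look like this (the domain, paths, and port are invented for illustration):

```nginx
server {
    listen 80;
    server_name shop.example.com;

    # Serve the React production build as static files
    location / {
        root /var/www/shop/build;
        try_files $uri /index.html;
    }

    # Forward API calls to a Node.js microservice on an internal port
    location /api/ {
        proxy_pass http://127.0.0.1:3001;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```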
This year I gained expertise in more advanced React concepts like HOCs, render props, hooks, etc. React hooks really help you avoid some unwanted complexity, like using the connect HOC to connect your component to Redux. Now you can use useSelector() and useDispatch() instead of connect when using react-redux. Also, the new useParams(), useLocation(), and useHistory() hooks in the new react-router-dom help you avoid the withRouter HOC. Custom hooks are one of the features that attracted me to React hooks, and after using them for a while, I cannot think of going back to the old style.
Another advantage of working with the above-mentioned big e-commerce client was time and resources. Big companies have more resources working on a feature, so you get more time to complete your work. Since these are very big projects, you need to write proper unit tests so that, later on, other developers don't break the features you build. This client exposure really helped me learn unit testing in React using Jest and Enzyme. If you get a chance to work with a big team with enough time, then I really recommend writing unit tests in your project. After all, it is not difficult to write tests.
Although I had previously done a small client project that used MongoDB, working with the above e-commerce client gave me much more exposure to MongoDB. I learned to write some APIs using the Mongo client in Node.js, wrote some cron job scripts for MongoDB to store reports in a new collection, gained experience working with Robo 3T, etc. From a DB standpoint, I still love SQL over NoSQL databases due to the data integrity they provide. MongoDB is a good choice if strict data integrity is not a requirement and you need to speed up your development.
When you are working on big projects with a lot of resources, adding TypeScript to React will give you some edge. But I am not a big fan of TypeScript, as it adds unwanted complexity to small and mid-size projects and will definitely increase your overall development time. Use TypeScript if you have enough resources and time.
You can still ensure a pretty good coding standard by using ESLint and Prettier in your projects. In fact, I am a big fan of them. I am also sharing the ESLint configuration I used below.
Although I had previously done some small projects in Node.js, this year I was lucky to work full time on a Node.js project for a bank in the Middle East. I learned about the modular development pattern (which helps in creating microservices), proper logging, creating custom events, the Sequelize ORM, authentication and authorization, scheduling, best practices and security, proper debugging using VS Code, production deployment using PM2 and an Nginx reverse proxy configuration, unit testing using Jest, etc.
Previously, I used only basic Git concepts like cloning, branching, stashes, SSH configuration, etc. in my projects. My master branch was not locked, and I didn't do any merge requests or code reviews. When I started working on big team projects, I became familiar with merge requests, code reviews, git rebase, git cherry-pick, reverting commits, Git hooks, tagging sprint releases, etc. I also learned that micro commits are always best.
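As a sketch of that flow, here is a throwaway repo exercising micro commits, a rebase onto the latest main, and a sprint release tag (branch and tag names are invented; assumes Git 2.28+ for `git init -b`):

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q -b main
git config user.email "dev@example.com"
git config user.name "Dev"

# micro commits: one small, self-contained change per commit
echo "base" > app.txt
git add app.txt && git commit -qm "feat: add app skeleton"
git checkout -q -b feature/search
echo "search" >> app.txt
git commit -qam "feat: add search box"

# meanwhile main moves on, so rebase the feature branch onto it
git checkout -q main
echo "hotfix" > fix.txt
git add fix.txt && git commit -qm "fix: urgent production fix"
git checkout -q feature/search
git rebase -q main

# merge back and tag the sprint release
git checkout -q main
git merge -q --no-edit feature/search
git tag -a sprint-12 -m "Sprint 12 release"
git tag --list
```

Cherry-picking works the same way in reverse: `git cherry-pick <sha>` copies a single micro commit onto another branch, which is exactly why small commits pay off.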
Working with a delivery manager with over 20 years of industry experience really changed my perspective on database design. He showed me how to design a good, scalable database with proper constraints to ensure data integrity. He also showed me naming strategies and how to avoid unwanted tables through proper normalization. Previously, I used an ORM to set up my migrations and never used compound unique key constraints. Now I have learned that proper DB design is a major factor in the success of a project.
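For instance, a compound unique key pushes an integrity rule down into the database itself, where no application bug can bypass it. A hedged sketch with invented table and column names:

```sql
-- A user may review a product at most once; the database enforces it,
-- not the application code.
CREATE TABLE product_reviews (
    id         BIGINT PRIMARY KEY,
    user_id    BIGINT NOT NULL REFERENCES users (id),
    product_id BIGINT NOT NULL REFERENCES products (id),
    rating     SMALLINT NOT NULL CHECK (rating BETWEEN 1 AND 5),
    CONSTRAINT uq_user_product UNIQUE (user_id, product_id)
);
```

A second insert with the same (user_id, product_id) pair fails at the DB layer regardless of which service or script attempts it.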
Although data structures and algorithms are a major part of my computer engineering syllabus, I never gave proper dedication to these subjects, as I leaned more towards programming. Now I understand my mistake. Good knowledge of data structures like trees can help you design and develop complex things like route matching in an efficient manner. For example, find-my-way is an HTTP router that internally uses a highly performant radix tree; it is used by Fastify and Restify, which gives them a huge speed advantage over Express. This year I started learning the basics of data structures and some algorithms. I am looking forward to learning some advanced ones and doing LeetCode-style coding practice next year.
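To make the idea concrete, here is a toy trie keyed by path segments. It is far simpler than find-my-way's compressed radix tree (no parametric routes, no per-character compression), but it shows why a tree lookup beats scanning a flat list of routes:

```javascript
// Toy route trie: each path segment is one level of the tree,
// so lookup cost depends on path depth, not on how many routes exist.
class RouteTrie {
  constructor() {
    this.root = { children: new Map(), handler: null };
  }

  add(path, handler) {
    let node = this.root;
    for (const seg of path.split('/').filter(Boolean)) {
      if (!node.children.has(seg)) {
        node.children.set(seg, { children: new Map(), handler: null });
      }
      node = node.children.get(seg);
    }
    node.handler = handler;
  }

  find(path) {
    let node = this.root;
    for (const seg of path.split('/').filter(Boolean)) {
      node = node.children.get(seg);
      if (!node) return null; // no registered route shares this prefix
    }
    return node.handler;
  }
}

const routes = new RouteTrie();
routes.add('/users', () => 'list users');
routes.add('/users/activity', () => 'user activity');

console.log(routes.find('/users/activity')()); // user activity
console.log(routes.find('/nope')); // null
```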
One piece of career advice: always give first priority to learning proper coding practices and development. Clients always value delivery over complexity and algorithms.
This year I learned the major concepts of Progressive Web Apps (PWA) and how to convert an existing application into a PWA using Workbox. I also wrote a tutorial on how to easily convert your existing Laravel application into a Progressive Web App using Workbox. You can find the article below.
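For a taste of how little setup Workbox needs, generating a precaching service worker with workbox-cli takes only a small config file (paths and glob patterns here are made-up examples):

```javascript
// workbox-config.js — consumed by `workbox generateSW workbox-config.js`
module.exports = {
  globDirectory: 'public/',
  globPatterns: ['**/*.{html,css,js,png,svg}'],
  swDest: 'public/service-worker.js',
};
```

The generated service worker precaches every matched asset, which is the core of making an existing app work offline.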
I learned the major PWA concepts by purchasing Progressive Web Apps (PWA) - The Complete Guide by Maximilian Schwarzmüller. You can also find the link to that course below.
Now I will share with you some of the technical mistakes I made, or more appropriately, some bad technical choices I made in 2019. So here is the list:
Please note that Express is not a bad framework. It is one of the best, simplest, and most popular Node.js frameworks out there. The only reason I give more preference to Fastify or Restify is their speed: in benchmarks they are almost twice as fast as Express. For example, find-my-way is an HTTP router that internally uses a highly performant radix tree; it is used by Fastify and Restify, which gives them a huge speed advantage over Express.
Moreover, their syntax is almost identical to Express's. Another factor is that most of the third-party libraries I used in my Node.js projects are not framework-specific. For example, for form validation I used Yup, so that I can share my validation schema between the frontend (React Formik) and the backend. You can add Yup to any Node.js framework, as it is not framework-dependent. The same goes for the ORM, logging, sending emails, etc.
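The sharing pattern itself is framework-agnostic. Since Yup needs the package installed, here is a plain-JavaScript stand-in showing the idea: one validation module that both the Formik form and the API route import (the field names and rules are invented for illustration):

```javascript
// validation/signup.js — one module, imported by both the React form
// (e.g. as Formik's `validate` prop) and the backend route handler.
function validateSignup({ email = '', password = '' }) {
  const errors = {};
  if (!/^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(email)) {
    errors.email = 'Invalid email address';
  }
  if (password.length < 8) {
    errors.password = 'Password must be at least 8 characters';
  }
  return { valid: Object.keys(errors).length === 0, errors };
}

module.exports = { validateSignup };

// Both sides call the same function, so the rules can never drift apart:
console.log(validateSignup({ email: 'a@b.com', password: 'secret123' }).valid); // true
console.log(validateSignup({ email: 'nope', password: 'short' }).errors.email); // Invalid email address
```

With Yup, the exported value is a schema object instead of a function, but the single-source-of-truth idea is exactly the same.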
Please note that UIkit 3 is a great UI framework. It follows a modern design pattern and has a very good collection of UI components. Some of the problems I personally faced are:
Here also, please note that Jest is not a bad testing framework. It is one of the most advanced and fully featured unit testing frameworks out there. If you use Jest for writing unit tests in React, then using it in a Node project is an easy switch. Moreover, unlike many other testing frameworks, Jest gives you a test runner, an assertion library, and mocks out of the box.
Now here is the problem I encountered. Instead of mocking the DB queries, I tested my repositories by directly inserting data into a test database and cleaning the data after each test suite ran. This helped me be 100% sure that my DB queries worked fine. But as the number of test suites increased, my assertions started failing. The reason is that, unlike other testing frameworks such as Mocha, Jest runs test suites in parallel for better speed, so multiple parallel DB inserts from different test suites broke my assertions.
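If you hit the same issue, one way out (besides giving each suite its own isolated data) is simply forcing Jest to run suites serially, which is what the `--runInBand` CLI flag does. The same thing expressed as config:

```javascript
// jest.config.js — run test suites one at a time so DB-backed suites
// don't race each other (equivalent to running `jest --runInBand`).
module.exports = {
  maxWorkers: 1,
};
```

You trade away the parallel speed, so it is best limited to the integration-style suites that actually touch the shared database.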
By the way, I still prefer Jest. I am in love with its syntax...
Here is the list of technologies I would like to learn in 2020.
"ELK" is the acronym for three open source projects: Elasticsearch, Logstash, and Kibana. Elasticsearch is a search and analytics engine. Logstash is a server-side data processing pipeline that ingests data from multiple sources simultaneously, transforms it, and then sends it to a "stash" like Elasticsearch. Kibana lets users visualize Elasticsearch data with charts and graphs.
Apache Kafka is a distributed publish-subscribe messaging system and a robust queue that can handle a high volume of data and lets you pass messages from one endpoint to another. Kafka is one of the hottest technology stacks right now. It is suitable for building both offline and online message consumption.
Although I am pretty familiar with the implementation of these stacks through some tutorials, I have not tried them in any of my real-world projects. Currently I am working in the banking domain, so data is stored on the bank's dedicated servers, and my previous client, the e-commerce company, stores data in the database in base64 format.
Although I am pretty familiar with concepts like Docker images and containers, I have not used them in any of my projects. I am looking forward to learning more about Docker next year.
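For anyone else starting out, the image/container distinction clicks quickly once you write a Dockerfile: the file describes the image, and containers are running instances of it. A minimal sketch for a small Node.js app (the base image tag, port, and entry file are examples):

```dockerfile
# Build a Node.js app image: dependencies first so they cache between builds.
FROM node:12-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci --only=production
COPY . .
EXPOSE 3000
CMD ["node", "server.js"]
```

`docker build -t myapp .` produces the image, and `docker run -p 3000:3000 myapp` starts a container from it.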
As I already mentioned, I started learning the basics of data structures and algorithms this year. Next year I wish to get into more advanced topics like radix trees, red-black trees, etc., and do some LeetCode-style coding practice to improve my skills.
I think that is it. This blog post has become too long and I am tired of typing, so I am concluding it here. If you have any good suggestions, please let me know. Also, please let me know if you would like more information about any of the above-mentioned topics.