<![CDATA[Small World]]>
https://codetiger.github.io/blog/
Ghost 5.81
Wed, 03 Apr 2024 13:32:06 GMT

<![CDATA[AI assisted coding (GitHub Copilot) and never going back]]>
It began when the CTO at my workplace, where I'm currently employed full-time, proposed, "Let's give GitHub Copilot a try; it could cut down engineers' time by 40%." He reiterated this idea several times, and my first reaction was a mix of skepticism

]]>
https://codetiger.github.io/blog/github-copilot-ai-assisted-coding/65d202af5399d0a5df72c5b1Sun, 18 Feb 2024 15:40:41 GMT

It began when the CTO at my workplace, where I'm currently employed full-time, proposed, "Let's give GitHub Copilot a try; it could cut down engineers' time by 40%." He reiterated this idea several times, and my first reaction was a mix of skepticism 😏 and curiosity 🤔. This reaction stemmed from a past encounter with a different organisation's CTO who had proclaimed, "Coding is a thing of the past; the world has moved on to connecting components like plug and play."

Here's my personal feedback after experimenting with GitHub Copilot for a few days on my personal project.

TL;DR

  1. I'll never code without AI assistance (GitHub Copilot, for now) again.
  2. Does it save 40% of engineering effort? Probably much more.
  3. Does it take the job of an engineer? Hell NO.

Disclaimer

  1. The feedback is based on my personal experience while working on my weekend project.
  2. I'm not a legal advisor, so consult your legal team before making decisions.
  3. It's OK to assume I'm exaggerating, because my initial reaction was the same before I tried it myself.

The bumpy road

So, I decided to take on this side project of rebuilding one of my iOS apps just for kicks. The original version was done with Objective-C and OpenGL/Metal APIs, but now I'm switching things up with SwiftUI, Swift, and SceneKit. It's my first time diving into Swift, SwiftUI, and SceneKit, and let me tell you, it's a wild ride.

This app isn't your run-of-the-mill type – it's got heavy-duty UI design and 3D rendering going on. So, yeah, I expected a pretty gnarly learning curve. I'm using Xcode on my trusty MacOS, and even though GitHub Copilot isn't officially vibing with it, there are some cool open-source extensions that make it play nice.

The type of driver I am

I'm the kind of programmer who's always tinkering with code and building things, even if it's just for fun on my personal projects. Despite my job not requiring me to code over the past decade, I still find myself diving into it whenever I can.

Now, when it comes to programming languages, I'm a bit of a polyglot. I hop between them like a squirrel in a nut store. Remembering every syntax? Nah, that's not my style. I prefer to keep my mind free and depend on good ol' search engines or, more recently, the helpful nudges from ChatGPT. So, you could say I'm all about that dynamic, ever-learning coding life.

I struggle with learning anything related to UI design, and dancing with div alignment in HTML and HStack/VStack in SwiftUI always frustrates me because I never seem to get it right, even after trying multiple times.

The ride experience

I kicked off this project flying solo, no AI to lend a hand, just to see how it rolls. After a week of wrestling with new programming languages, syntax, and frameworks, I thought, "Why not give GitHub Copilot a spin?" So, I went on with the personal subscription and slapped on those extensions. Let the coding games begin! 🚀

First reaction

So, I fired up the same project and man, was I disappointed at first. I was half expecting fireworks—new toolbars, a flashy AI menu, maybe even a little chatbot buddy on the side. But nope, nada. My expectations totally missed the mark.

Anyway, I dove into coding like I always do, just to see what this AI fuss was about and where it was hiding. And let me tell you, I was floored. As soon as I started typing, bam! AI suggestions started popping up left and right, tailored perfectly to what I was working on. It was like the AI had been watching over my shoulder the whole time, picking up on my coding style and preferences.

But here's the real kicker: these suggestions weren't just generic boilerplate. They were spot-on, matching my coding style and design principles to a T. It was like having a supercharged coding buddy who knew exactly what I needed before I even knew I needed it.

Suddenly, UI development felt like a breeze. No more bouncing between search engines, docs, ChatGPT, and Stack Overflow. The IDE and its sneaky AI assistant had my back, doing all the heavy lifting so I could focus on what really matters—bringing my ideas to life.

How well the AI grasps the local context is just mind-blowing. It's like the whole developer experience got a major upgrade overnight. The last time I felt this kind of game-changing shift was...

  1. Back in the day, when Turbo-C++ rolled out with that amazing help document right there in the IDE, it was a game-changer. Before that, I was constantly flipping through a book for syntax references while coding. I can't even express how much I appreciated the folks behind the IDE for bringing in that genius idea of contextual help. It was like a dream come true for every coder out there.
  2. And then, along came the internet community, with powerhouses like Stack Overflow and the trusty search engine combo, taking problem-solving to a whole new level. It's like they opened up a treasure trove of solutions, each one tackling the same problem but with a different twist. It was like having a global team of coding buddies at your fingertips, ready to lend a hand whenever you needed it.
  3. ChatGPT certainly pushed things forward, but it didn't quite hit the mark. Enter GitHub Copilot, taking the game to a whole new dimension. Check out this screenshot—suggestions for the 3rd button, complete with pre-filled code snippets based on my past code. This right here is one of those jaw-dropping moments where I just couldn't believe what I was seeing. Mind blown, for real! 🚀

If you thought UI development (declarative programming) was full of pattern similarities and easy for AI to handle, I had the same hunch and decided to test it out with Shader programming in 3D rendering. And guess what? Lightning struck twice—it had that same wow effect all over again. It's like finding those hidden threads that connect different realms of coding, making it a wild and eye-opening ride.


Destination

GitHub Copilot is definitely the next logical step in the evolution of developer experience. It's no wonder why my CTO sang its praises and hyped up its potential to change our lives. This tool is like having a coding genius right at your fingertips, ready to jump in and lend a hand whenever you need it. It's a game-changer, plain and simple.

Fellow programmers, to address the age-old question of "Am I losing my job to AI?" my answer is a resounding no. Instead, you're on the brink of becoming even better at what you do. Think of it like how power tools revolutionized carpentry—they didn't replace the carpenter; they just made their skills more efficient and effective. Similarly, AI tools like GitHub Copilot are here to enhance our capabilities, not replace us. So, embrace the change and get ready to take your coding skills to new heights!

If my GitHub Copilot experience sounds like an exaggeration to you, no hard feelings—I was skeptical too until I gave it a shot. Happy coding, and may your programming adventures be filled with pleasant surprises! 🚀

]]>
<![CDATA[Custom PCB for the gaming console based on RP2040]]>In this blog post, we delve further into the fascinating journey of crafting a retro-style game console entirely from the ground up—an extension of my previous article.

The initial version was constructed using the RPi Pico as the main board, with additional modules integrated around it. Nevertheless, for

]]>
https://codetiger.github.io/blog/my-first-attempt-on-pcb-designing-based-on-rp2040/65a4207c34d9232f723a0250
Sat, 20 Jan 2024 12:02:26 GMT

In this blog post, we delve further into the fascinating journey of crafting a retro-style game console entirely from the ground up, an extension of my previous article.

The initial version was constructed using the RPi Pico as the main board, with additional modules integrated around it. Nevertheless, for the upcoming iteration, I aimed to elevate the intrigue further and opted to create a custom printed circuit board (PCB) specifically tailored for the console.

Learning KiCad:

After dabbling with a bunch of design tools, free and fancy ones included, I finally found my groove with KiCad. It might not be the superstar of design tools, but it's got the basics and a killer community to back it up. Jumping into open source tools is always a bit of a rollercoaster at the start, with the learning curve feeling like a mountain. Wrangling the PCB design in KiCad took me a good few months—about 10 iterations' worth—to get it just right and comfy. Shoutout to the myriad open source RP2040 boards, especially the ones from Adafruit, for teaching me the nitty-gritty of PCB design.


PCB Design and Fabrication Journey:

So, as a newbie PCB designer, I was on edge when my design was almost ready. I kept going over it for what felt like the gazillionth time, fearing something might go south. After a few months of this madness, I finally decided that my design was good to go for fabrication.


Now, the real challenge kicked in – picking a fabrication vendor and choosing components based on what's available. I played around with changes for different vendors, which meant going back to the drawing board a few times. Eventually, I settled on this fab company in China that's pretty famous. They did an awesome job, and the customs charges were surprisingly easy on the wallet.

Playing with Device Drivers:

So, when the PCBs finally rolled in, my first move was checking if the device played nice in both boot mode and regular mode. To my surprise, I managed to slap on a hello world code without a hitch. A bunch of folks diving into custom PCB designs for the first time were griping about devices not getting detected or struggling to upload code. Lucky for me, everything worked like a charm, though I did goof up on the SPI pins for the SD Card.

I picked pins for both the LCD and SD Card that danced to the same SPI controller. That meant I could only use one at a time. Not a biggie for me, though, since I had no plans to give SD Card some love in the near future. Even if I change my mind, I can still resort to PIO programming to make it happen. I tinkered with Micropython and managed to juggle both the LCD and SD Card without breaking a sweat on the performance front. Since the LCD is one of those devices that's a bandwidth hog, I couldn't afford any dips in performance there.

Achieving a Fully Operational OS:

After a few more months of grappling with challenges, I triumphed in getting all the devices on the custom PCB up and running, mirroring the functionality of the previous device version without any compromises. Witnessing the games come to life on the new device brought a huge smile to my face. It's been quite a journey!


Open Source:

The project remains true to its roots, with both the hardware and software being fully open source and accessible in the repository GameTiger Console source.

]]>
<![CDATA[The Leopard of silence - Street Art in Bucharest]]>Bucharest, Romania was very different than how I imagined. People were very cool, unjudgmental and very calm. Usually cities are very different as people tend to be fast and look in a hurry. The other thing I noticed is the graffiti all over the streets, which I ignored in the

]]>
https://codetiger.github.io/blog/the-leopard-of-silence/65a4207c34d9232f723a024eMon, 26 Dec 2022 08:53:48 GMT

Bucharest, Romania was very different from how I had imagined it. People were very cool, non-judgmental and very calm. Usually big cities feel different, with people moving fast and looking like they're in a hurry. The other thing I noticed was the graffiti all over the streets, which I ignored in the first few days. I had been looking at this graffiti as just street vandalism; however, I realised it was more of an art form as soon as I saw this one on Calea Griviței.

The usual graffiti I noticed was all just names of artists and some weird-looking paintings. This one, however, caught my attention, as it clearly wasn't a joke. The artist had put a lot of time into designing it and painting it on such a large wall. I quickly realised the entire painting had a lot of text embedded in it, so I took a picture and walked away. I did a lot of searching to understand what this artwork had hidden beneath it. The rest of the article covers what I was able to decode from the art.

KERO-IRLO-OCU

At the very top of the painting you can see the names of the artists: Kero, Irlo and Ocu. A quick search on the internet shows that they are quite famous in Romania for a previously controversial painting that was taken down immediately after it went public. This painting, to some extent, looks like revenge for the removal of their previous work.

Kero said in an interview that the painting is not revenge but is based on the words: "De ce vezi tu paiul din ochiul fratelui tău și nu te uiți cu băgare de seamă la bârna din ochiul tău?", which translates to "Why do you look at the speck of sawdust in your brother's eye and pay no attention to the plank in your own eye?".

Ocu said in an interview, "The leopard symbolizes the slow and unconscious movement of our negative side, the automatisms hidden deep in various portions of the mind. Once aware, this movement helps to observe different situations with clarity and transform them into positive parts of life. We move in the direction of self-development through self-observation, and not by judging those around us."

The leopard

The very first thing you notice is the angry leopard and the fence. There is a famous saying in Romania, "Afară-i vopsit gardul și înăuntru-i leopardul", which translates to "Outside is a painted fence, inside is a leopard". Though the expression sounds meaningless in literal terms, Romanians use it to emphasize the glaring difference between what is seen and what is real.

There is a different meaning behind this saying, which is believed to be its origin. At a circus, there is usually a huge drawing with leopards and other animals to attract customers. The circus employee shouts, "The fence is painted outside, the leopard is inside," meaning, "What you see outside is just a painting; there is a live beast inside, for which you need to buy tickets." However, the way this expression is used today is very different.

If you take a deeper look at the leopard, the rosette spots on it are not random. They form continuous words that read "Acesta nu este un leopard. Acestea sunt fricile noastre. Petele noastre. Păcatele noastre. Înainte să o dai de gard, întreabă-te: ce e un leopard?", which directly translates to "This is not a leopard. These are our fears. Our spots. Our sins. Before you jump on the fence, ask yourself: what is a leopard?". You can also notice the copyright symbol at the end of the tail, "© 2020".

The huge chain in the leopard's mouth is taken from the neck of a character who is rich and preaches things that he does not practise himself.

Overall, the fierce expression of the leopard conveys that the things you hide will come back and haunt you.

The spiritual leader removing the graffiti

The character in the foreground layer, who looks like a religious leader, removes the graffiti from the fences while he himself wears a robe made of graffiti. This definitely reads like revenge for the same artists' previous work being taken down under pressure from a nearby church. It reminds me of the saying, "When one person makes an accusation, check to be sure he himself is not the guilty one. Sometimes it is those whose case is weak who make the most clamour." The art beautifully portrays that same saying.

Learning

I stared at this photo for a week and read various sources in order to write this article. I couldn't find a single place that decoded what the artists were saying, so I'm putting down everything I learnt from this artwork.

What was shocking to me is that this is not just graffiti, but a revolution through art.

]]>
<![CDATA[The dream of reaching near space]]>Story from 1990s:

Around the 1990s, my parents moved to a rented house in the center of our home town Rajapalayam. That's when I first met our landlord Mr. Rajalingam Raja, who was a retired businessman. He closed his grocery store business and started doing small finance for the rest

]]>
https://codetiger.github.io/blog/dream-of-reaching-near-space-using-high-altitude-balloon/65a4207c34d9232f723a0245
Mon, 16 May 2022 04:04:54 GMT

Story from 1990s:

Around the 1990s, my parents moved to a rented house in the center of our home town, Rajapalayam. That's when I first met our landlord, Mr. Rajalingam Raja, a retired businessman. He had closed his grocery store business and took up small-scale finance for the rest of his life. While I am introducing him as my landlord, he was never really one to me. He was always a best friend, a mentor and a godfather, despite being 50 years older than me.

Mr. Rajalingam Raja:

Despite his limited formal education and exposure, he had a deep passion for electronics. He understood electrical and electronic systems very well purely out of that passion for technology, a skill he gained through constant learning despite his age. For the first time in my life, I came across someone who had a hobby unrelated to their profession.

Rajalingam Raja, Rajapalayam

While he is not alive today to share my current hobbies and see the advancements, I always consider him a godfather close to my heart, the person because of whom I am what I am today.

If I had not met him back then, I can't imagine what I would be today. I would definitely not have developed the interest in electronics that eventually transformed into a passion for software programming.

Team 2.25:

My godfather had a very close friend with whom he shared his dreams and hobbies. Mr. Subbiah Raja was another great person, with a deep passion for photography and technology.


Together they dreamed of flying a balloon filled with lightweight gases and attaching a small blinking light, to prank their friends with a UFO sighting. This was a top-secret project that, as far as I know, was never revealed to anyone other than the three of us. They dreamed of connecting a small light bulb to a battery and making it blink at a certain frequency. They would often brainstorm about this idea, and one challenge remained unsolved: producing the lightweight gas. The 2 in the title comes from these two gentlemen with great ambitions, and the 0.25 comes from a 5th grader who joined their dream club.

Unfortunately they never achieved this dream, but they embedded the idea so deep in my head that even after three decades it keeps floating around. I remember every conversation they had about this idea, and the dreams I had after each discussion.

Attempt #1 (2008-09):

In 2008, I decided to actually try this out in a more modern way, after I came across a video online about another near-space experiment. However, after a long gap away from electronics, I couldn't complete the dream and gave up after burning the GPS and RF chips I had imported from the UK. The year 2008 also ended up being a bit tough at work, and I had to put a big pause on this dream again.

Attempt #2 (2020-21):

After more than a decade, and after getting back into electronics hobbies with the Raspberry Pi, I decided to resume this dream. This time I had more ideas and a lot of previous experiments to follow. I decided to add plenty of new ideas on top of the original dream and gave it more scientific goals than just pranking someone. I started building a near-space satellite project with electronic components commercially available to any electronics enthusiast today.

RaLiSat-1:

I decided to name it after my godfather, in his memory. I continued the dream, this time with more knowledge and focus. It took me a year (on and off, mostly during weekends) to fully assemble, test and launch the satellite. Here is a series of blog posts from the past related to RaLiSat-1.

  1. Payload system design
  2. Payload design challenges
  3. Base station system design

Test scenarios included putting it into a freezer to test sub-zero temperatures. Of course, even at its maximum setting, my parents' refrigerator could only get down to -12°C, while the worst case in real-world near space can reach -60°C. I definitely underestimated this, expecting something close to -45°C, but it seems to have reached -63°C even at 14 km altitude.

And this is what the dashboard looked like when communication with the payload was lost.


Why did I try this?

While this is a popular hobby among electronics and near-space enthusiasts in some parts of the world, it is not a common thing in India. The one reason I attempted this was that

The idea had been running in my head for almost 3 decades, and I would definitely have gone crazy at some point if I didn't try.

Now, I am unable to tell exactly what will come next. Stay tuned.

]]>
<![CDATA[Building a retro style game console from scratch]]>
https://codetiger.github.io/blog/building-a-retro-style-game-console-in-2022/65a4207c34d9232f723a024dSat, 12 Mar 2022 12:22:34 GMT

I have always been, and will always be, passionate about gaming. The very first thing that inspired me to get into electronics and computers was the fun of playing and building games.

My favourite hobby is programming a simple game whenever I pick up a new language or computing platform. To add some fun, this time I wanted to build a small handheld gaming console from scratch.

My family named this device "GameTiger" and the logo reflects its name. I'll talk more about the logo later in the article. The device looks amateurish, and I wanted to keep it that way for some time, until I can call the device complete in both hardware and software.

Hardware:

The entire hardware is custom built and is based on the Raspberry Pi Pico microcontroller board. The choice of MCU was based on its simplicity, cost and support for various tools. I know very well that with my expert-level soldering skills, I'll definitely fry a few components, so I wanted it to be cheap enough that I don't spend too much.

  • MCU RP2040
    • 32-bit dual ARM Cortex-M0+ Microcontroller
    • 133 MHz Clock speed
    • 264 KB SRAM
    • 2 MB flash storage
    • 26 GPIO pins
  • LCD display module by Waveshare
    • Resolution: 240×320
    • Color: 262K RGB (18-bit)
    • Interface: SPI
    • Driver: ST7789
    • Backlight: LED
    • Operating voltage: 3.3V/5V
  • Tactile Buttons
  • LiPo SHIM for Pico by Pimoroni
    • MCP73831 charger
    • XB6096I2S battery protector
    • Supports battery level measuring on VSYS pin
  • Witty Fox Li-Ion Battery
    • Voltage: 3.7v
    • Capacity: 1000 mAh

Wiring:

The components use standard interfaces, so there is nothing complicated in the wiring. Feel free to use different GPIO pins based on the many tutorials out there, but this is what I've used and configured as the default in the software.

Component   Pin     Pico GPIO   Description
LCD         VCC     VSYS        Power input
            GND     GND         Ground
            DIN     GP11        MOSI pin of SPI, data transmitted
            CLK     GP10        SCK pin of SPI, clock pin
            CS      GP9         Chip select of SPI, active low
            DC      GP8         Data/Command control pin (High: data; Low: command)
            RST     GP12        Reset pin, active low
            BL      GP13        Backlight control
Buttons     Up      GPIO2       Up button on the keypad
            Down    GPIO0       Down button on the keypad
            Left    GPIO1       Left button on the keypad
            Right   GPIO3       Right button on the keypad
            A       GPIO4       A (Action) button on the keypad
            B       GPIO5       B (Back) button on the keypad
LiPo SHIM   -       -           Directly mounted on the Pico based on its datasheet

Software:

Yes, you read that right: the chip has only 264 KB of RAM, and that's a lot less than what we're used to these days. To explain the complexity of building the software: the framebuffer alone, for storing the on-screen pixel data, takes 153.6 KB (320 width * 240 height * 2 bytes). While the LCD display supports up to 262K colors, I decided to use only 2 bytes per pixel to save RAM; storing RGB888 instead would need 230 KB, which is more than 85% of the RAM. Also, we can't do double buffering the traditional way like most games do, and even storing a screen-sized sprite sheet in RAM is not possible. Below is the list of modules I've built into the software.

  • Operating System Drivers
    • Display driver over SPI using DMA (Direct Memory Access)
    • Button interrupts
    • Battery management system driver
  • Framebuffer Library
    • Supports transparency
    • Direct streaming to display memory (partial/full updates)
    • Primitive shape drawing including Line, Circle, Rect and Fill Rect
    • Supports drawing images with alpha channel
    • All framebuffer operations support DMA (Direct Memory Access)
  • Sprite sheet
    • Support for sprite sheet
    • Basic tilemap support
  • Font system based on Sprite sheet
  • Menu system
    • Dynamically loading games
    • Hardware config
      • Display brightness
      • Display sleep time after inactivity
  • Filesystem
    • Support for SD card module to load game assets

The most complex part of the software was the framebuffer implementation. There are two modes available for games: one, of course, uses the framebuffer that takes 154 KB of RAM and updates the display memory periodically; the other streams changes directly to the display. The other complexity is using both cores available in the chip. Unlike on CPUs, the MCU cores are a bit different in the way you can use threads in your code.

Just to give you an idea of the complexity again: the splash screen that shows the tiger logo uses the full framebuffer and loads the logo image, which is another 32 KB, along with a basic font image of 12 KB. That's already around 200 KB of RAM in use, so my code had to be written very carefully with respect to variable usage and memory allocation. For example, use an 8-bit variable type wherever possible; I can't default to int for everything, as it takes 32 bits.
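Just as a sanity check on those numbers, here is a quick throwaway calculation (plain Python, nothing to do with the firmware itself; "KB" here follows the 1024-byte convention used above):

```python
# Back-of-the-envelope RAM budget for the splash screen, against the
# RP2040's 264 KB of SRAM (numbers taken from the paragraph above).
SRAM = 264 * 1024
framebuffer = 320 * 240 * 2          # RGB565, 2 bytes per pixel = 153,600 bytes
rgb888_alternative = 320 * 240 * 3   # ~230 KB, which is why RGB888 is off the table
logo_image = 32 * 1024
font_image = 12 * 1024

used = framebuffer + logo_image + font_image
print(f"splash screen uses {used} of {SRAM} bytes ({used / SRAM:.0%})")  # roughly 73%
```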

The sample game, as always, is a Snake game very similar to what we used to have on Nokia 1100 handsets. The framebuffer is optimised to achieve a target of 30 frames per second; the Snake game achieves more than 44 FPS on default settings without overclocking.

More to come:

I am planning to add more games to this hardware in the future when I find time, and shall keep posting updates here. I am also planning to create a 3D-printed case for the device to make it look more professional.

Source code:

The entire source of the GameTiger Console is available on GitHub. Feel free to share your feedback, and also share the games or applications you would like to see on this device.

]]>
<![CDATA[Remote ePaper display using ESP32]]>I had a 7.8 inch ePaper display from Waveshare lying around for a while. This is an expensive eInk display with 1872x1404 pixel resolution that supports 4 bit grayscale values.

Objective: Building a photo frame with the content streamed from my home server built using a Raspberry Pi. The

]]>
https://codetiger.github.io/blog/remote-epaper-display-using-esp32/65a4207c34d9232f723a0249Sat, 29 Jan 2022 03:00:10 GMT

I had a 7.8 inch ePaper display from Waveshare lying around for a while. This is an expensive eInk display with 1872x1404 pixel resolution that supports 4 bit grayscale values.

Objective: Building a photo frame with the content streamed from my home server built using a Raspberry Pi. The RPi will stream family pics available in the storage and intermittently show home power consumption dashboard on screen.

Challenges: As with any project, this one had a very particular constraint: keeping the hardware simple. The ESP32 has WiFi support but not enough RAM to hold the framebuffer needed. The display has 2,628,288 pixels (1872 x 1404), each needing 4 bits, which makes 1.25 MB. The MCU driving the display needs at least 1.25 MB as a framebuffer, because the display expects the data to be handed to the driver as a sequence of commands.

Our powerful ESP32 has only about 500 KB of RAM, and not all of it is available to the program. The display is driven by an IT8951 controller module. The available IT8951 driver, open sourced by the device manufacturer, supports neither streaming nor the ESP32. Their code assumes the entire image buffer is available before pushing it to the driver.

After taking a look at how the code works, I had some hope that this could be achieved by rewriting it. However, the chip and the display were going to be powered from a battery and needed to be as efficient as possible. I had to rewrite most of the buffering and remove the need to hold the whole image in RAM.

I used a TCP connection between the RPi and the ESP32. My ESP32 code listens on TCP port 8319. The RPi code is written in Python and configured as a cron job; each run pushes a random image from a given directory. All the post-processing, like resizing and colour conversion, had to be done on the RPi, as the ESP32 cannot handle it for lack of RAM. The ESP32 code receives the sequence of bytes from the RPi and streams it directly to the IT8951 module. I was able to achieve this easily and got an image displayed on screen. Please ignore the noise in the image below; it comes from an encoding issue that I solved later.
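Roughly, the RPi-side sender looks like this. This is a minimal sketch, not my actual script: the host address, photo path and the exact 4-bit packing order are illustrative assumptions, while port 8319 is the one mentioned above.

```python
#!/usr/bin/env python3
# Sketch of the RPi-side sender: resize, convert to 4-bit grayscale, stream
# the raw bytes over TCP to the ESP32 (which forwards them to the IT8951).
import random
import socket
from pathlib import Path
from PIL import Image

ESP32_HOST = "192.168.1.50"    # hypothetical address of the ESP32
ESP32_PORT = 8319              # port the ESP32 listens on (from the post)
PHOTO_DIR = Path("/home/pi/photos")
WIDTH, HEIGHT = 1872, 1404     # panel resolution

def to_4bit_packed(img: Image.Image) -> bytes:
    """Resize to the panel size, convert to 8-bit grayscale and pack two
    4-bit pixels per byte (the panel uses 16 gray levels)."""
    img = img.convert("L").resize((WIDTH, HEIGHT))
    pixels = img.tobytes()
    packed = bytearray()
    for i in range(0, len(pixels), 2):
        hi = pixels[i] >> 4
        lo = pixels[i + 1] >> 4
        packed.append((hi << 4) | lo)   # assumed packing order
    return bytes(packed)

def push_random_image() -> None:
    photo = random.choice(list(PHOTO_DIR.glob("*.jpg")))
    payload = to_4bit_packed(Image.open(photo))
    with socket.create_connection((ESP32_HOST, ESP32_PORT)) as sock:
        sock.sendall(payload)   # the ESP32 side streams these bytes onward

if __name__ == "__main__":
    push_random_image()
```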


After fixing the noise issue and getting a proper image on screen, I had another challenge. It took up to 30 seconds to stream a full-size image from the RPi, which is way too much for what we are doing. I realised the TCP overhead for each packet added up and slowed the whole thing down.

I added intermediate buffering. Our ESP32 has some RAM which cannot be ignored, so I decided to buffer the data stream, which should speed up the task. Instead of reading each byte from the TCP socket, I configured the code to read a 1 KB buffer (the buffer size is configurable). After a buffer is read, the TCP socket is free to receive more data while I push the buffered data to the IT8951. With 1 KB buffers, I was able to stream the content in less than a second, which was my target.

The reason for a 1-second target is that the refresh time of the display is more than a second, so I have to wait for the display to get ready anyway. Finally, the streaming worked perfectly, but the images had an issue. If you look at the picture below, you'll see the text is pixelated. This was definitely not how the source image looked.


After fighting with it for a whole day, I figured out that I was sending the pixel data in big-endian format while the device was configured to receive little-endian. After fixing this issue, the picture looked perfect on screen. Unfortunately, many open-source drivers available on GitHub have a similar issue. The problem doesn't show up much when you display a photo, but it is very noticeable when you display text.
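For reference, the fix on the sender side can be as simple as swapping adjacent bytes before transmission. A minimal sketch, assuming the controller consumes the buffer as 16-bit little-endian words:

```python
def swap_to_little_endian(buf: bytes) -> bytes:
    """Swap each pair of bytes so 16-bit words read as little-endian.
    Assumes the buffer length is even, as it is for this panel."""
    out = bytearray(len(buf))
    out[0::2] = buf[1::2]
    out[1::2] = buf[0::2]
    return bytes(out)
```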


In the picture above, I've scraped the Grafana dashboard and streamed it to the display. I used Selenium to take a screenshot of the page and used the same code to push it to the display. The entire source code is available here.

]]>
<![CDATA[Home power consumption monitoring using ESP32]]>It's always fun to collect data around you and understand your needs better. When it comes to power consumption, the end-of-month bill gives us a pretty good view, but I wanted to make it slightly interesting. I wanted to collect power consumption every second in my house and

]]>
https://codetiger.github.io/blog/home-power-consumption-monitoring-using-esp32/65a4207c34d9232f723a024cTue, 28 Dec 2021 07:49:57 GMT

It's always fun to collect data around you and understand your needs better. When it comes to power consumption, the end-of-month bill gives a pretty good view, but I wanted to make it slightly more interesting: collect the power consumption of my house every second and see what the data looks like.

Components used:

  1. ESP WROOM 32 MCU Module - (Robu.in)
  2. SCT-013-030 Non-invasive AC Current Sensor Clamp Sensor - (Robu.in)
  3. 100k Ohm resistors - 2 pieces
  4. 10uF capacitor - 1 piece
  5. 3.5mm jack female connector - 1 piece

Wiring diagram:

The wiring diagram was taken mostly from this article and the OpenEnergyMonitor project. It is the same as the standard ones on those websites; I didn't want to copy or redo them here, as the other articles already explain these things best.

ESP32 Energy Monitor Circuit
ESP32 Energy Monitor Circuit

As you can clearly see, my soldering skills are not that great. :-)

Software:

With not many changes to the hardware, I did a lot of research on making the software better. The ESP32 is known to have accuracy issues with its ADC, so I spent a lot of time improving that. I used the open-source Emon library and added a lookup table for my ESP32's ADC, which gives very good results in accurately measuring the analog input. As is well known, each ESP32's ADC needs to be calibrated; while recently manufactured ESP32s are calibrated at the factory, I still didn't find mine accurate enough. So I calibrated the chip myself and generated a lookup table for it. The calibration code is available here.
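To give a feel for how such a lookup table is applied, here is a minimal sketch (written in Python purely for illustration; my actual correction runs inside the ESP32 sketch, and the calibration values below are made up): raw ADC readings are mapped to corrected voltages by interpolating between calibration points.

```python
# Illustrative only: the real correction lives in the ESP32 firmware.
# CALIBRATION maps a few raw ADC readings to the voltages measured with a
# trusted meter during calibration (the numbers here are invented).
CALIBRATION = [
    (0, 0.075),      # raw ADC reading -> measured volts
    (1024, 0.85),
    (2048, 1.60),
    (3072, 2.40),
    (4095, 3.15),
]

def corrected_volts(raw: int) -> float:
    """Linearly interpolate the calibration table for a raw ADC reading."""
    for (r0, v0), (r1, v1) in zip(CALIBRATION, CALIBRATION[1:]):
        if r0 <= raw <= r1:
            return v0 + (v1 - v0) * (raw - r0) / (r1 - r0)
    return CALIBRATION[-1][1]   # clamp anything above the last point
```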

The entire code is available here: ESP32 Home energy monitoring

The ESP32 sketch works as a Prometheus exporter, making it easy to log the data as a time series. Prometheus has also been my personal favourite in terms of resource usage.


Finally the data is available in Grafana. As I already have a Raspberry Pi running in my house for various other things, I added Prometheus and Grafana for monitoring the energy consumption as well.

Final packing:

ESP32 + SCT-13-030 CT sensor for home power monitoring
]]>
<![CDATA[RaLiSat-1 Base station system design]]>For my high altitude balloon project, I am designing a portable base station system to receive the transmission from the payload.

Hardware:

The base station is a Raspberry Pi 3B+ based system with Lora module E32-868T30D from manufacturer Ebyte. This is very much the same module that I've

]]>
https://codetiger.github.io/blog/ground-station/65a4207c34d9232f723a024a
Tue, 14 Dec 2021 11:22:19 GMT

For my high-altitude balloon project, I am designing a portable base station system to receive the transmission from the payload.

Hardware:

The base station is a Raspberry Pi 3B+ based system with the Lora module E32-868T30D from the manufacturer Ebyte. This is very much the same module that I've used in the payload for transmission. I've also added an active buzzer to sound a beep whenever the system receives location data, which is designed to happen every 5 seconds. The buzzer helped a lot during the testing phase, when I had to keep the system constantly running in various conditions. I believe it will help during the actual flight as well, as the chase is going to be continuous.

Software:

The software is fairly simple and uses the same Lora classes from the payload source code. The objective is simply to wait for data from the payload and send an acknowledgement. Internally, the payload sensor data is decoded and stored in InfluxDB. I am using Grafana integrated with InfluxDB to pull up a beautiful dashboard for quick data visualisation.
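Roughly, the storage step looks like the sketch below. This assumes the InfluxDB 1.x Python client and a made-up packet layout and field names; the real decoding is handled by the Lora classes shared with the payload code.

```python
#!/usr/bin/env python3
# Sketch: decode one telemetry packet and store it in InfluxDB.
# Packet layout and field names are illustrative assumptions.
import struct
from influxdb import InfluxDBClient   # InfluxDB 1.x Python client

client = InfluxDBClient(host="localhost", port=8086, database="ralisat")

def store_telemetry(packet: bytes) -> None:
    """Unpack lat/lon/altitude/temperature (4 little-endian floats, assumed)
    and write them as one point in the 'telemetry' measurement."""
    lat, lon, alt, temp = struct.unpack("<ffff", packet[:16])
    client.write_points([{
        "measurement": "telemetry",
        "fields": {"lat": lat, "lon": lon, "altitude": alt, "temp": temp},
    }])

if __name__ == "__main__":
    # Example: a fake packet, as if it had just arrived over Lora.
    store_telemetry(struct.pack("<ffff", 9.4533, 77.5566, 1234.5, -12.3))
```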

Grafana Dashboard - High altitude balloon project - RaLiSat-1

Payload chasing plan:

The idea is to use the base station system with a WiFi dongle and a power bank, so the Grafana dashboard can be accessed from a laptop. The map shows the latest position of the payload and will help chase it during the descent. Hoping this will work! Stay tuned for the final results.

]]>
<![CDATA[RaLiSat-1 design challenges - Payload internal temperature]]>As we know, the higher you reach in altitude, the lower the temperature and pressure is. At around 25 kms which is my target for this project, the temperature is -56°C. This is way beyond the range of operating temperature for any commercial electronic.

Generic Solution: Most projects

]]>
https://codetiger.github.io/blog/ralisat-1-payload-design-challenges/65a4207c34d9232f723a0248
Tue, 14 Dec 2021 07:31:02 GMT

As we know, the higher you go in altitude, the lower the temperature and pressure. At around 25 km, which is my target for this project, the temperature is about -56°C. This is way beyond the operating temperature range of any commercial electronics.

Generic solution: Most projects use a polystyrene box and add a heat source like a hand warmer to keep the container at the right temperature. This approach increases both the size and the weight of the payload.

Solution used: As a design target, I wanted the payload to weigh less than 240 grams, primarily so I could use as little helium as possible and a 350-gram balloon. I built a Styrofoam cube from layered sheets, with the electronic components placed within the layers, and enclosed it. The components were fitted into small engravings in the layers, and super glue was used to fit the layers together perfectly. Obviously the GPS ceramic antenna, environment sensor, camera and Lora antenna had to be left outside. The overall design looks like the image below.

Test scenarios: To test the design, the primary approach was leaving the system inside a freezer (-20°C) for up to 2 hours and observing the min/max CPU temperatures. My test criterion was to keep the CPU temperature between 35°C and 75°C. This is very hard to accomplish, as the Raspberry Pi Zero does not have any airflow and easily reached 75°C even inside the freezer. I then did a lot of performance tuning on the system to balance CPU usage against the heat produced. At a certain point, the temperature was stable under most environmental conditions. I tested in direct sunlight, inside the freezer and at room temperature, and the CPU temperature stayed well within range in all conditions.

Conclusion:

Finally, I reached a point where I never had to worry about the internal temperature hitting unexpected ranges. But remember, it took me around a month to stabilise this. I tried under-clocking the Raspberry Pi, but gave up on that option as it reduced heat generation drastically and the internal CPU temperature dropped to an unexpectedly low 10°C within 25 minutes in the freezer; I couldn't reach a stable temperature range with enough computing power while under-clocking. I also tried writing a CPU-intensive bash script that would run for a few seconds to raise the temperature whenever it dropped below a certain threshold.
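That idea, in Python instead of bash, would look roughly like the sketch below. The thresholds and the busy-loop duration are illustrative, and in the end I solved the problem differently, by scheduling the naturally hot code paths.

```python
#!/usr/bin/env python3
# Sketch of a "thermal guard": burn CPU briefly whenever the core gets too
# cold. Thresholds/durations are illustrative, not my final configuration.
import time

LOW_C = 35.0          # start heating below this CPU temperature
BURN_SECONDS = 10.0   # how long to busy-loop each time

def cpu_temp_c() -> float:
    # Standard Raspberry Pi thermal zone, reported in millidegrees Celsius.
    with open("/sys/class/thermal/thermal_zone0/temp") as f:
        return int(f.read().strip()) / 1000.0

def burn(seconds: float) -> None:
    """Busy-loop to generate heat on one core."""
    end = time.monotonic() + seconds
    while time.monotonic() < end:
        pass

while True:
    if cpu_temp_c() < LOW_C:
        burn(BURN_SECONDS)
    time.sleep(30)
```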

In the end, understanding which parts of the code produce the most heat, and configuring how often they run, is what solved this problem.

The GPU usage, especially the camera image capture and resizing, was what produced enough heat to make this work. The entire code is available in my GitHub project RPi-Hab.

]]>
<![CDATA[RaLiSat-1 Payload system design]]>To start with, I had basic goals for designing the payload system. Including basic sensors necessary for the tracking, camera to capture the beauty and the design to sustain the system at harsh climatic conditions in the sky.

Goals for the first flight:

  1. Use a single board computer of smallest
]]>
https://codetiger.github.io/blog/ralisat-1-payload-design/65a4207c34d9232f723a0247
Sun, 12 Dec 2021 06:33:11 GMT

To start with, I had basic goals for designing the payload system: include the basic sensors necessary for tracking, a camera to capture the beauty, and a design that can sustain the system through the harsh climatic conditions in the sky.

Goals for the first flight:

  1. Use a single board computer of smallest form-factor possible to record data
  2. Add a camera capable of taking pictures at 1 sec interval
  3. Add Temperature, humidity, and pressure sensors
  4. GPS module for tracking
  5. RF Module to transmit data to ground station
  6. Battery enough to power the flight time and beyond

With simple goals laid out, I started doing some research on each component. For my first flight, I wanted to keep things simple so I could experience the actual difficulties first-hand. Obviously I have ambitions for better things in the future, but I didn't want to keep designing forever rather than trying something.

Flight computer design - RaLiSat-1

Raspberry Pi and Camera:

I decided to go with a Raspberry Pi Zero based payload system so I could do enough computing without my hands tied, as they would be with a microcontroller. In fact, the RPi Zero was a bit too much for my goal; however, it comes in handy and I already had one from my previous projects. The main intention was to keep the design as plug-and-play as possible; my previous flight-computer attempt in 2008 failed primarily for this reason. The RPi camera was an easy choice, as it is very compact and integration is easy.

BME680 Environmental Sensors:

SeeedStudio's BME680 sensor module looked very impressive, as it supports an I2C interface and has all the sensors I wanted in one module. One downside: the pressure and temperature sensors have operating ranges of 300-1100 hPa and -40 to +85°C respectively. With a goal of reaching 25 km altitude, the environment will be around -56°C and 25 hPa, both outside the sensors' range. However, it is very hard to find a sensor that is this cheap and this easy to integrate, so I decided to go with it anyway and see how it works.

GPS Module:

With no idea how the pressure sensor would behave below 300 hPa, the only other option was a GPS module that works at the target altitude. I chose the UBlox M8N, which supports a maximum altitude of up to 50 km in its Airborne mode. Again, this was an easy choice for me. The UBlox M8N GPS module integration and source code are explained in this previous article.

Lora E32-868T30D as RF module:

For tracking the flight, I wanted the GPS coordinates transmitted to the ground station at frequent intervals. Lora was an easy choice, and the Ebyte E32 modules have a UART interface, which has a lot of advantages. Integrating the E32-868T30D with the Raspberry Pi is explained here. The primary purpose was to send GPS coordinates, but after seeing its capability I eventually decided to send almost all recorded data, including images, through the Lora module. The source code and details are provided in this article. Making my program use Lora efficiently to send large images and data took a lot of time.

Battery:

I used an old, small and compact 6000 mAh power bank from my uncle and ripped off the case to reduce weight. The battery was able to last up to 5 hours, which was more than sufficient for the entire flight time including tracking.

The modules were put together, and the entire source code took around 3 months to stabilise and fine-tune. The challenges faced in the design will be discussed in another post. Stay tuned.

]]>
<![CDATA[Aquarium automation using Raspberry Pi]]>As a childhood desire, I always wanted to try setting up an aquarium at home. The desire was always there deep inside and never sparked until my daughter had to do some school activity on aquariums. So we decided to buy a small 10 liter aquarium tank with 3 molly

]]>
https://codetiger.github.io/blog/aquarium-automation-using-raspberry-pi/65a4207c34d9232f723a0246
Wed, 08 Dec 2021 14:05:49 GMT

As a childhood desire, I had always wanted to set up an aquarium at home. The desire was always there deep inside and never sparked until my daughter had to do a school activity on aquariums. So we decided to buy a small 10-litre aquarium tank with 3 molly fish. To our surprise, the fish started giving birth and the desire intensified. So we planned to set up a large 150-litre aquarium in our house and do some landscaping with beautiful plants.

After 6 months of hard work, we were able to reach the state below. Please don't judge my landscaping skills; our objective was to simulate the closest natural habitat for the fish we had.

Challenges and why automation:

If you are an aquarium enthusiast like me and have tried a planted tank, the challenges are going to be very obvious. You need to maintain a lot of things: the good bacteria level, the CO2 level for plants, the O2 level for fish, overall lighting for the plants, fish food, and of course the plants need nutrients as well. The challenges grow from there to weekly water maintenance and trimming the plants. Overall, I got a bit frustrated doing a lot of these things repeatedly on a daily basis. As a programmer, my brain pushed me to automate it. The overall challenge is to schedule the various pieces of equipment based on the requirements below.

  1. High-intensity plant lights need to be switched on for only a few hours during the day.
  2. A low-intensity light needs to be ON for an hour before and after the plant lights are ON, just to simulate dawn and dusk.
  3. CO2 needs to be ON around the same time as the high-intensity light, as the plants only consume CO2 when there is light. The CO2 tank needs to be switched off 30 minutes before the high-intensity lights, to save CO2.
  4. An air pump brings the O2 level back up after the CO2 is OFF. O2 is essential for the fish and plants at night, so the pump has to run whenever CO2 is not flowing.
  5. The heavy-duty filter needs to work for only a few hours of the day, which should be enough to clean up the dirt and fish waste. There is another, less powerful filter that works round the clock.
  6. A water cooler brings down and maintains the water temperature during the daytime; my location is mostly very hot during the day.
  7. Did I forget about feeding the fish? Yes, I made a small feeder using a motor and a food container with a small hole.

Electronics:

The overall automation setup is very simple, as all this equipment was already connected to power and only needed manual switching on and off. I decided to go with a Raspberry Pi 3 B+ and a relay controller module. The wiring is simple, as the relay module just needs to be connected to 8 GPIO pins for the on/off signals.
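The core scheduling idea is just a loop that keeps each relay's GPIO output in the state its time window calls for. The sketch below is illustrative only: the pin numbers, hours and active-low assumption are not my actual configuration.

```python
#!/usr/bin/env python3
# Minimal sketch of the relay scheduling loop. Pins, hours and the active-low
# relay assumption are illustrative, not the real GitHub project's config.
from datetime import datetime
import time
import RPi.GPIO as GPIO

SCHEDULE = {
    # name: (BCM GPIO pin, on hour, off hour)
    "plant_light": (5, 10, 16),
    "co2":         (6, 10, 15),   # off a while before the lights, to save CO2
    "air_pump":    (13, 15, 10),  # runs whenever CO2 is not flowing (wraps midnight)
    "filter":      (19, 11, 14),
}

GPIO.setmode(GPIO.BCM)
for pin, _, _ in SCHEDULE.values():
    GPIO.setup(pin, GPIO.OUT, initial=GPIO.HIGH)   # HIGH = relay off (active low)

def should_be_on(hour: int, on_h: int, off_h: int) -> bool:
    """Handle windows that wrap past midnight (e.g. air pump 15:00 to 10:00)."""
    return on_h <= hour < off_h if on_h < off_h else (hour >= on_h or hour < off_h)

while True:
    hour = datetime.now().hour
    for name, (pin, on_h, off_h) in SCHEDULE.items():
        GPIO.output(pin, GPIO.LOW if should_be_on(hour, on_h, off_h) else GPIO.HIGH)
    time.sleep(60)
```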

Relay module

The overall setup had the relay module connected to a power extension strip, so the same setup can be reused in the future. The image below shows the power strip with all the equipment connected; the cardboard box holds the relay module and the Raspberry Pi.

You probably have a lot of questions:

Why use a full-blown computer when a simple timer chip or microcontroller would do?
Answer: I've built a web interface that allows my family to manually override the configuration if needed and also shows the current state of the equipment. For example, we wanted to be able to switch the high-intensity lights on whenever someone wants to take a look at the aquarium. The image below shows the web page used to control the individual pieces of equipment manually.

On top of this, I was already using this Raspberry Pi as a home server and a local DNS server running Pi-hole. The whole source code is available in my GitHub Aquarium project.

Stability of the setup:

I've been running this setup for almost 3 years in a row without any issues. No wait, with all known issues fixed. The aquarium managed to last 8 months without any human intervention during the COVID-19 country-wide lockdown. We had to move to our native town early in the lockdown and couldn't go to the city for 8 months. All we had was this automated setup, which worked well and kept most of the fish alive. The only manual work during the last 2 years was changing the water and refilling the fish food once every 3 to 5 months.

I would say this was one of the most remarkable things I've ever done to save lives. :-)
]]>
<![CDATA[Interfacing Ublox GPS M8N with Raspberry Pi]]>The Ublox GPS M8N module supports UART communication and is fairly easy to integrate with your Raspberry Pi. Just connecting the power and Rx wires will allow you to read the GPS NMEA messages. However, the module is much more capable than just sending messages in raw text format. The module supports lot

]]>
https://codetiger.github.io/blog/interfacing-ublox-gps-m8n-with-raspberry-pi/65a4207c34d9232f723a0244
Wed, 08 Dec 2021 04:51:30 GMT

The Ublox GPS M8N module supports UART communication and is fairly easy to integrate with your Raspberry Pi. Just connecting the power and Rx wires will allow you to read the GPS NMEA messages. However, the module is much more capable than just sending messages in raw text format: it supports a lot of configuration options and a binary mode, which is much easier to handle.

Wiring:

In my case, I was already using the Tx/Rx wires for another module, so I ran out of UART ports on my Raspberry Pi Zero. As the Zero has only one UART pair and does not have an inbuilt USB hub either, I decided to use the micro USB port and hard-wired a USB-UART module directly to the Pi Zero board.

Ublox M8N GPS Module wiring with CP2102 and Raspberry Pi 

Code:

The GPS class in my project imports the ublox Python module, which is available here. This module lets us configure the device to work in binary mode and gives us functions to configure the frequency of various messages. You can also configure the dynamic model, which is essential for my project, as altitude calculation matters most.

#!/usr/bin/env python3

import logging, math, time
from ublox import *
from threading import Thread

class GPSModule(Thread):
    gps = None
    latitude = 0.0
    longitude = 0.0
    altitude = 0.0
    fix_status = 0
    satellites = 0
    healthy = True
    onHighAltitude = False

    def __init__(self, portname="/dev/ttyUSB0", timeout=2, baudrate=9600):
        logging.getLogger("HABControl")
        logging.info('Initialising GPS Module')
        try:
            self.gps = UBlox(port=portname, timeout=timeout, baudrate=baudrate)
            self.gps.set_binary()
            self.gps.configure_poll_port()
            self.gps.configure_solution_rate(rate_ms=1000)
            self.gps.set_preferred_dynamic_model(DYNAMIC_MODEL_PEDESTRIAN)
            self.gps.configure_message_rate(CLASS_NAV, MSG_NAV_POSLLH, 1)
            self.gps.configure_message_rate(CLASS_NAV, MSG_NAV_SOL, 1)

            Thread.__init__(self)
            self.healthy = True
            self.start()
        except Exception as e:
            logging.error('Unable to initialise GPS: %s' % str(e), exc_info=True)
            self.gps = None
            self.healthy = False

    def run(self):
        while self.healthy:
            self.readData()
            time.sleep(1.0)

    def checkPressure(self, pressure):
        # Convert barometric pressure (hPa) to an approximate altitude using
        # the standard-atmosphere formula, then update the dynamic model.
        alt = 0.0
        if pressure != 0:
            alt = 44330.0 * (1.0 - math.pow(pressure / 1013.25, 0.1903))
        self.checkAltitude(alt)

    def checkAltitude(self, altitude):
        # Switch to the Airborne dynamic model above ~9 km (with hysteresis)
        # so the receiver keeps producing fixes at high altitude.
        if altitude != 0:
            if altitude > 9000 and not self.onHighAltitude:
                self.onHighAltitude = True
                self.gps.set_preferred_dynamic_model(DYNAMIC_MODEL_AIRBORNE1G)
            if self.onHighAltitude and altitude < 8000:
                self.onHighAltitude = False
                self.gps.set_preferred_dynamic_model(DYNAMIC_MODEL_PEDESTRIAN)

    def readData(self):
        try:
            msg = self.gps.receive_message()
    
            if msg is not None:
                logging.debug(msg)
                if msg.name() == "NAV_SOL":
                    msg.unpack()
                    self.satellites = msg.numSV
                    self.fix_status = msg.gpsFix
                elif msg.name() == "NAV_POSLLH":
                    msg.unpack()
                    self.latitude = msg.Latitude * 1e-7
                    self.longitude = msg.Longitude * 1e-7
                    self.altitude = msg.hMSL / 1000.0

                    if self.altitude < 0.0:
                        self.altitude = 0.0
        except Exception as e:
            logging.error("Unable to read from GPS Chip - %s" % str(e), exc_info=True)
            self.healthy = False

    def close(self):
        logging.info("Closing GPS Module object")
        self.healthy = False
        self.gps.close()
        self.gps = None

The Python class has been tested in various scenarios for many days continuously. Even the dynamic model switching works fine in real time and does not need a reset. Please feel free to modify or improve the class for your use case. The entire project source is available in my GitHub project.
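For completeness, a minimal usage sketch (the serial port path depends on what your USB-UART adapter enumerates as; /dev/ttyUSB0 is simply the class default above):

```python
import time

gps = GPSModule(portname="/dev/ttyUSB0")   # starts its own reader thread
try:
    while gps.healthy:
        print(f"fix={gps.fix_status} sats={gps.satellites} "
              f"lat={gps.latitude:.5f} lon={gps.longitude:.5f} alt={gps.altitude:.1f} m")
        time.sleep(5)
finally:
    gps.close()
```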

]]>
<![CDATA[Sending large data like images over Lora]]>Got obsessed with the Lora module E32-868T30D after playing with it for a couple of months now. To be honest, this seemed to me like a lifesaver for hobbyists starting out with RF. My previous attempts failed when I had to add RF to my projects. This time it was

]]>
https://codetiger.github.io/blog/sending-large-data-like-images-over-lora/65a4207c34d9232f723a0243
Mon, 06 Dec 2021 13:13:23 GMT

I got obsessed with the Lora module E32-868T30D after playing with it for a couple of months now. To be honest, it seems to me like a lifesaver for hobbyists starting out with RF. My previous attempts failed when I had to add RF to my projects; this time it was mostly plug and play with simple configuration.

Speed:

I read everywhere that Lora is meant for small data packets and not for larger data like images. There were some calculations showing Lora would take hours or days to transfer a single image. These claims are not entirely true and mostly apply to certain network providers. In my project I needed P2P data transfer, and it worked just fine.

Initially I started by transferring 40 bytes of sensor data every 5 seconds, which worked just fine. After analysing the logs, I found that the chip completes the air transmission in a few milliseconds and just waits for more data. So I decided to push more data by breaking an image into chunks and rebuilding it on the receiver side.

A 38 KB image takes me around 5 minutes to transfer, with enough delay in between chunks to avoid buffer overflows. There is no easy way to calculate the exact delay needed for each chunk to get transferred, and depending on the AUX pin alone did not help my case much either. So I've implemented a hybrid approach that worked for me.

Acknowledgement:

In my implementation, the Lora class accepts files of up to a few megabytes and internally splits and sends them in chunks. The sender confirms that each chunk was received successfully by marking it only after getting an acknowledgement. The reason AUX alone didn't work for me was that both transceivers had to send and receive messages, and the module doesn't support duplex transmission, so a predefined delay interval helped in my case. Otherwise, the 38 KB image could have been transferred in less than a minute.
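To make the chunk-and-acknowledge scheme concrete, here is a stripped-down sketch of the idea. It is illustrative only: the chunk size, header layout and the send/receive primitives are assumptions, and the real class additionally persists chunk state in SQLite and respects the AUX pin and inter-chunk delays.

```python
# Sketch of chunked transfer with per-chunk acknowledgements.
import struct

CHUNK_SIZE = 48   # small payload per Lora frame (illustrative)

def split_into_chunks(data: bytes):
    """Yield (sequence_number, frame) pairs, each frame carrying a tiny header."""
    for seq, offset in enumerate(range(0, len(data), CHUNK_SIZE)):
        payload = data[offset:offset + CHUNK_SIZE]
        yield seq, struct.pack("<HH", seq, len(payload)) + payload

def send_file(lora, data: bytes, max_retries: int = 5) -> None:
    """Send every chunk and mark it done only after an ACK comes back."""
    for seq, frame in split_into_chunks(data):
        for _ in range(max_retries):
            lora.send(frame)                  # assumed transmit primitive
            ack = lora.receive(timeout=2.0)   # assumed receive primitive
            if ack == struct.pack("<H", seq):
                break                         # chunk acknowledged, move on
        else:
            raise RuntimeError(f"chunk {seq} never acknowledged")
```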

Implementation:

The source code is available in my GitHub project. The Lora class uses SQLite to keep track of chunks and acknowledgements. This is the same class as in my previous post, where I explained the basics of integrating the E32-868T30D with a Raspberry Pi.

Using this implementation, I've run the project for a couple of days in a row with no buffer overflow issues. So this code is well tested, but feel free to improve it if you find problems.

]]>
<![CDATA[Connecting Lora E32 module with Raspberry Pi]]>
https://codetiger.github.io/blog/transfer-images-over-lora-e32-module/65a4207c34d9232f723a0242
Wed, 01 Dec 2021 09:42:26 GMT

Recently, I've been reading about Lora and how the RF industry has changed in the last 5 years. Especially for self-taught electronics hobbyists like me, with very limited knowledge of the RF communication domain, these new Lora modules have made life easier. These modules need no introduction, as the entire community is already talking about them, so I'll jump straight into the subject.

For my recent hobby project (which shall be explained in future posts), I need a long-range communication module between 2 Raspberry Pi based devices. The approximate distance is roughly 30 km with line of sight. The 30 km and line of sight should be enough to guess what I am planning to do; however, I'll still keep that for another post. The easiest way to achieve long-range communication today without needing a licence is a Lora module, as it uses the ISM band and modules are readily available for the various bands used in different countries.

Regulations:

In India, the frequency range 865 - 867 MHz is delicensed and allocated for general-purpose use. The regulatory documents say we are allowed to use a maximum of 1 W transmission power within this range. Standard LoRa modules are pre-configured for various global ISM bands like 433 MHz, 868 MHz and 915 MHz. If you notice, none of these frequencies falls within the delicensed band in India; the closest is 868 MHz, but it sits just outside the specified range.

Thankfully, each LoRa module model supports a range of frequencies, exposed as a set of channels. The device I chose for my project is the E32-868T30D, which supports a frequency range from 862 MHz to 893 MHz. I was a bit confused because the regulations did not explicitly mention whether 865 and 867 MHz themselves are included in the allowed range, so just to be on the safer side I configured my devices to communicate at 866 MHz, which is well within the range.
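Based on my reading of the EByte datasheet, the channel byte is just an offset in MHz from the 862 MHz base, so 866 MHz corresponds to channel 0x04 (the CHAN byte you'll see in the config packet of the Python class below):

# Channel byte for the E32-868T30D (assuming channel = frequency - 862 MHz, per the datasheet)
BASE_FREQUENCY_MHZ = 862
target_frequency_mhz = 866          # stays inside India's 865-867 MHz delicensed band
channel = target_frequency_mhz - BASE_FREQUENCY_MHZ
print(hex(channel))                 # 0x4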

Lora E32-868T30D:

In fact, I did a lot of research on various modules and finally chose this device just because of its 1 W transmission power support. The real advantage of this device, though, comes from its support for UART communication. You can plug two devices into a computer with USB-UART adaptors and start configuring and testing them right away. I used the CoolTerm app on my Mac for debugging and understanding the behaviour of the device. The manufacturer, EByte, has provided good documentation for this module. It didn't help me much in the beginning while I was trying to understand the device's behaviour, but once you get the hang of it, the document comes in handy as a reference.

The device has M0 and M1 pins for selecting the operating mode (Normal, Wake-up, Power saving and Sleep). Many articles say these pins can be ignored, but neither ignoring them nor connecting them directly to ground worked for me. The only thing that worked was connecting them to GPIO pins (17 and 27) and programmatically setting them high or low for the different modes. To configure the settings using commands, you first need to put the device into Sleep mode; in Normal mode the device then uses the persisted config. So I decided to use the GPIO pins to apply the configuration on program startup, so that even if I swap the devices, the new settings take effect.

Note: You can also configure and persist the settings once from a computer, and then use the UART to just send data directly. That worked as expected too.

The device has another pin, AUX, which becomes very important when sending large amounts of data continuously. If you send data at an interval longer than the transmission time (usually a few tens of milliseconds), you don't need to worry about this pin. The pin tells you whether the previously written data is still being processed by the internal transmitter, so you need to wait until it signals that the module is ready for more data.
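A minimal sketch of that wait, assuming the AUX pin is wired to GPIO 18 (as in the class later in this post) and that AUX reads high when the module is idle:

import RPi.GPIO as GPIO
import time

AUX_PIN = 18  # BCM numbering, matching the class below

GPIO.setmode(GPIO.BCM)
GPIO.setup(AUX_PIN, GPIO.IN)

def wait_for_aux(timeout=5.0):
    """Block until AUX goes high (module ready for more data) or the timeout expires."""
    deadline = time.time() + timeout
    while not GPIO.input(AUX_PIN):
        if time.time() > deadline:
            return False
        time.sleep(0.01)
    return True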

Connecting to Raspberry Pi:

For my project, I chose the Raspberry Pi Zero W for fairly obvious reasons: it is small enough, and it is a full Linux computer rather than an MCU. The wiring diagram below shows the overall connections used.

Remember to connect the TX pin of the RPi to the RX pin of the LoRa module and the RX pin of the RPi to the TX pin of the LoRa module. For most people this sounds obvious, but I initially had a lot of doubts about the LoRa module's pin naming.
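For reference, here is the pin mapping I can reconstruct from the class below (the GPIO numbers come from the code; the UART pins are the Pi Zero W's default /dev/serial0 pins; verify against the module datasheet before wiring):

# E32-868T30D pin -> Raspberry Pi Zero W (BCM numbering)
# TXD -> GPIO15 (RPi UART RXD)
# RXD -> GPIO14 (RPi UART TXD)
# AUX -> GPIO18
# M0  -> GPIO17
# M1  -> GPIO27
# VCC -> 5 V supply (check the datasheet for current draw at 1 W transmit)
# GND -> GND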

Source code:

The entire source code is available as part of my GitHub project. Below is the reusable LoRa class in Python.

#!/usr/bin/env python3

import serial
import logging, time
from datetime import datetime
import sqlite3
from struct import *
from threading import Thread
import RPi.GPIO as GPIO

AUX_PIN = 18
M0_PIN = 17
M1_PIN = 27

MODE_NORMAL = 0
MODE_WAKEUP = 1
MODE_POWER_SAVING = 2
MODE_SLEEP = 3

MAX_PACKET_SIZE = 58

class LoraModule(Thread):
    ser = None
    dbConn = None
    delayAfterTransmit = 1.5
    lastTransmitTime = None
    addressHigh = 0x0
    addressLow = 0x0
    port = ""
    healthy = True

    def __init__(self, port="/dev/serial0", addressHigh=0xbc, addressLow=0x01, dataTimer=True, delay=1.5):
        logging.getLogger("HABControl")
        logging.info('Initialising Lora Module')
        GPIO.setmode(GPIO.BCM)
        GPIO.setup(AUX_PIN, GPIO.IN)
        GPIO.setup(M0_PIN, GPIO.OUT)
        GPIO.setup(M1_PIN, GPIO.OUT)
        self.delayAfterTransmit = delay
        self.lastTransmitTime = datetime.now()
        self.addressHigh = addressHigh
        self.addressLow = addressLow
        self.port = port

        self.setupPort()

        if dataTimer:
            self.dbConn = sqlite3.connect('data.db', detect_types=sqlite3.PARSE_DECLTYPES, check_same_thread=False)
            self.dbConn.execute("CREATE TABLE IF NOT EXISTS habdata(id INTEGER PRIMARY KEY, data BLOB NOT NULL, chunked INT DEFAULT 0 NOT NULL, created timestamp NOT NULL, ack INT DEFAULT 0 NOT NULL, lasttry timestamp NOT NULL);")

            Thread.__init__(self)
            self.healthy = True
            self.start()

    def setupPort(self):
        self.setMode(MODE_SLEEP)
        try:
            self.ser = serial.Serial(self.port, 9600, timeout=1, bytesize=serial.EIGHTBITS, parity=serial.PARITY_NONE, stopbits=serial.STOPBITS_ONE)
        except Exception as e:
            logging.error("Could not Open Lora Port - %s" % str(e))
            self.ser = None

        if self.ser is None:
            return

        self.resetLoraModule(False)
        # Per the E32 SPED byte encoding, the 0x3d config byte selects 115200 baud UART
        # (and 19.2 kbps air rate), so switch the host port to match
        self.ser.baudrate = 115200

    def resetLoraModule(self, hard=False):
        self.setMode(MODE_SLEEP)
        time.sleep(0.2)

        if hard:
            packet = bytes([0xc4, 0xc4, 0xc4])
            logging.info("Reset Lora Module Data: %s" % (packet.hex()))
            self.ser.write(packet)
            time.sleep(0.2)

        packet = bytes([0xc0, self.addressHigh, self.addressLow, 0x3d, 0x04, 0xc4])
        logging.info("Sending Config Packet Size: %d Data: %s" % (len(packet), packet.hex()))
        self.ser.write(packet)
        time.sleep(0.1)
        res = self.waitForData(6)
        if res is not None:
            logging.info("Config confirmation: %s" % (res.hex()))
        else:
            logging.info("Config confirmation timeout")
        time.sleep(0.1)

        self.setMode(MODE_NORMAL)
        time.sleep(0.1)

    def setMode(self, mode):
        if mode == MODE_NORMAL:
            logging.info("Setting Lora for Normal Mode")
            GPIO.output(M0_PIN, GPIO.LOW)
            GPIO.output(M1_PIN, GPIO.LOW)
        elif mode == MODE_WAKEUP:
            logging.info("Setting Lora for Wakeup Mode")
            GPIO.output(M0_PIN, GPIO.HIGH)
            GPIO.output(M1_PIN, GPIO.LOW)
        elif mode == MODE_POWER_SAVING:
            logging.info("Setting Lora for Power Saving Mode")
            GPIO.output(M0_PIN, GPIO.LOW)
            GPIO.output(M1_PIN, GPIO.HIGH)
        elif mode == MODE_SLEEP:
            logging.info("Setting Lora for Sleep Mode")
            GPIO.output(M0_PIN, GPIO.HIGH)
            GPIO.output(M1_PIN, GPIO.HIGH)

    def run(self):
        while self.healthy:
            duration = datetime.now() - self.lastTransmitTime
            secondsFromLastTransmit = duration.total_seconds()
            while not GPIO.input(AUX_PIN):
                time.sleep(0.025)
                duration = datetime.now() - self.lastTransmitTime
                secondsFromLastTransmit = duration.total_seconds()
                if secondsFromLastTransmit > 25:
                    self.healthy = False
            time.sleep(0.025)

            try:
                if self.ser.in_waiting > 0:
                    self.receiveThread()
                elif secondsFromLastTransmit > self.delayAfterTransmit:
                    self.transmitThread()
            except Exception as e:
                logging.error("Error in Lora module - %s" % str(e), exc_info=True)
                self.healthy = False

    def transmit(self, data):
        try:
            logging.info("Sending Packet Size: %d Data: %s" % (len(data), data.hex()))
            self.ser.write(data)
            self.ser.flush()
            self.lastTransmitTime = datetime.now()
        except Exception as e:
            logging.error("Could not send data to Lora Port - %s" % str(e), exc_info=True)
            self.healthy = False

    def transmitThread(self):
        try:
            packet = bytearray()
            packet.append(0xbc)
            packet.append(0x02)
            packet.append(0x04)
            rows = self.dbConn.execute("SELECT * FROM habdata WHERE ack = 0 and lasttry < Datetime('now', '-10 seconds') ORDER BY chunked ASC, created DESC LIMIT 5").fetchall()
            for row in rows:
                if len(packet) + len(row[1]) <= MAX_PACKET_SIZE:
                    packet.append(0xda)
                    packet.append((int(row[0]) & 0xff00) >> 8) # higher byte of id
                    packet.append(int(row[0]) & 0xff) # lower byte of id
                    size = int(len(row[1])) & 0xff
                    size *= (-1 if row[2] else 1)
                    size = size.to_bytes(1, byteorder='big', signed=True)[0]
                    packet.append(size) # size of data
                    packet.extend(row[1]) # data
                    self.dbConn.execute("UPDATE habdata SET lasttry = datetime('now') WHERE id = ?", [row[0]])
                else:
                    break

            if len(packet) > 3:
                self.transmit(packet)
        except Exception as e:
            logging.error("Could not send data to Lora - %s" % str(e), exc_info=True)
            self.healthy = False

    def waitForData(self, length, timeout=10):
        callTime = datetime.now()
        while self.ser.in_waiting < length:
            time.sleep(0.025)
            duration = datetime.now() - callTime
            secondsFromCallTime = duration.total_seconds()
            if secondsFromCallTime > timeout:
                return None

        data = self.ser.read(length)
        return data

    def receiveThread(self):
        if self.ser.in_waiting >= 3:
            try:
                data = self.ser.read(3)
                if len(data) == 3 and data[0] == 0xac:
                    high = int(data[1])
                    low = int(data[2])
                    dataid = (high << 8) | low
                    logging.info("Recieved ACK for %d" % (dataid))
                    self.dbConn.execute("UPDATE habdata SET ack = 1 WHERE id = ?", [dataid])
            except Exception as e:
                logging.error("Could not update ack to SQLite - %s" % str(e), exc_info=True)

    def sendData(self, data):
        # Per-chunk overhead: CallSign (1 byte), DataId (2 bytes), Size (1 byte), Chunk index (2 bytes), Total chunks (2 bytes)
        CHUNK_SIZE = MAX_PACKET_SIZE - 8

        try:
            isChunked = len(data) > CHUNK_SIZE
            # ceiling division, so an exact multiple of CHUNK_SIZE doesn't queue an empty extra chunk
            totalChunks = (len(data) + CHUNK_SIZE - 1) // CHUNK_SIZE
            if totalChunks > 255 * 255:
                logging.error("Unable to send file, check file size")
                return

            if isChunked:
                logging.debug("Data: Chunked %d, totalChunks %d" % (isChunked, totalChunks))
                for i in range(0, totalChunks):
                    dt = data[i*CHUNK_SIZE:(i+1)*CHUNK_SIZE]
                    packet = bytearray()
                    indexBytes = i.to_bytes(2, byteorder='big', signed=False)
                    packet.append(indexBytes[0])
                    packet.append(indexBytes[1])
                    totalChunksBytes = totalChunks.to_bytes(2, byteorder='big', signed=False)
                    packet.append(totalChunksBytes[0])
                    packet.append(totalChunksBytes[1])
                    packet.extend(dt)
                    logging.debug("Data added to Queue: %s", packet.hex())
                    self.dbConn.execute("INSERT INTO habdata(data, chunked, created, lasttry) VALUES (?, 1, datetime('now'), datetime('now'));", [sqlite3.Binary(packet)])
            else:
                logging.debug("Data added to Queue: %s", data.hex())
                self.dbConn.execute("INSERT INTO habdata(data, created, lasttry) VALUES (?, datetime('now'), datetime('now'));", [sqlite3.Binary(data)])
        except Exception as e:
            logging.error("Could not insert to SQLite - %s" % str(e), exc_info=True)

    def hasChunkData(self):
        try:
            row = self.dbConn.execute("SELECT COUNT(*) FROM habdata WHERE ack = 0 and chunked = 1").fetchone()
            if row and row[0] > 0:
                logging.debug("Chunk pending transmit: %d" % (row[0]))
                return True
            else:
                self.dbConn.execute("DELETE FROM habdata WHERE ack = 1 and chunked = 1")
                return False
        except Exception as e:
            logging.error("Could not read from SQLite - %s" % str(e), exc_info=True)
            return False

    def close(self):
        logging.info("Closing Lora Module object")
        self.healthy = False
        self.ser.close()
        self.ser = None
        self.dbConn.close()
        self.dbConn = None
        GPIO.cleanup()

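To tie it together, a minimal usage sketch (the lora module name and the image file are hypothetical; it assumes the class above is saved as lora.py):

import logging
import time
from lora import LoraModule   # hypothetical module name for the class above

logging.basicConfig(level=logging.INFO)

lora = LoraModule(port="/dev/serial0", addressHigh=0xbc, addressLow=0x01)

# Queue an image; sendData splits it into chunks which the worker thread transmits and retries until acknowledged
with open("photo.jpg", "rb") as f:
    lora.sendData(f.read())

# Wait while chunks are still pending, then clean up
while lora.hasChunkData():
    time.sleep(5)
lora.close()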
Feel free to remove the SQLite DB usage if you don't need it. The DB approach was introduced for sending large data and waiting for acknowledgements, which will be covered in detail in another post.

]]>
<![CDATA[Why blog (again)?]]>Tried a few times before and gave up on blogging each time! Even micro-blogging didn't work well for me. After all the failed attempts, this time I'm trying something different to see if I can do it.

Failure does not mean the end, at least until you fail
]]>
https://codetiger.github.io/blog/why-blog/65a4207c34d9232f723a0241Sat, 27 Nov 2021 04:39:23 GMTTried a few times before and gave up on blogging each time! Even micro-blogging didn't work well for me. After all the failed attempts, this time I'm trying something different to see if I can do it.

Failure does not mean the end, at least until you fail to use the try again option!

Goal

  • Keep a log of the hobby projects for future reference and memories
  • Try something I always thought I wasn't good at
  • Test the assumption that writing helps me stay focused. Prove it!
  • Consume less and produce more! (Stay away from mindless, endless scrolling and start being productive)
  • Post random topics and pictures like this one
Small world - Live and let live
]]>