My Path from Windows to Linux as My Primary Operating System

My interest in Linux did not come from problems with Windows, but from ordinary human curiosity. I had long wanted to understand how the system that many call "free" and "flexible" works. I already had some experience administering Debian-based servers, so the idea of trying Linux as the main system on a laptop had been in my head for a long time. I wanted to dig deeper and see how realistic it is to use Linux for everyday work, programming and creativity.

First Steps: Manjaro

The first distribution I installed was Manjaro. It looked beginner-friendly, but it is based on Arch Linux, and Arch demands careful attention and experience. At that time I was not ready for such depth: too many subtleties, too many nuances. At some point, due to my inexperience, the system simply stopped booting. It wasn't the distribution's fault but my own, yet it was a signal that the Arch approach wasn't for me yet. I wanted something more stable and familiar, closer to what I had already encountered on servers.

Ubuntu and Kubuntu: First Attempts

The next step was getting to know Ubuntu. It is popular, it is praised for its convenience, it has a huge community. But I was put off by the fact that a large company stands behind the project. This is not criticism; I simply had a feeling that such a system may have priorities that do not always coincide with the interests of the user. I wanted more transparency and independence. In this regard Ubuntu seemed too mainstream and corporate, and I decided to move on.

I tried Kubuntu more as an experiment. I wanted to see how another desktop environment would behave and feel how the system changes depending on the shell. The experience was interesting, but in the end I realized: I needed something fundamental, not derivative. I was looking for a foundation, for stability, for confidence that the system would not let me down.

Debian: Back to the Roots

That's how I came to Debian. It was a conscious choice. I knew that many popular distributions are derivatives of Debian, but I wanted to try the original. Debian has always been considered the standard of stability and predictability. And the first days of work confirmed it: this was exactly what I was looking for. Everything was logical, consistent, honest. No unnecessary processes, no feeling that someone is watching me or making decisions for me. I manage the system, not the other way around.

The installation went smoothly, without surprises. And when I first saw the Debian desktop, I felt that very confidence: the system will work until I break it myself. This was important to me. I wanted not just to use the computer, but to understand what it does and why. Having tried different desktop environments, I decided to settle on the standard GNOME that is installed by default.

Windows as a background

All this time, Windows remained (and remains to this day, as the operating system that came preinstalled on the computer) on a separate SSD. It was needed mainly for games that are not available on Linux and a few applications I didn't yet know alternatives for. Once, the system updated itself from version 10 to 11 almost imperceptibly, which was a complete surprise and a slight hint that Windows and Linux are moving in different directions. Perhaps I missed something myself, but an unpleasant aftertaste remained. Windows is fine for the mass-market user and for games, but I need a system for work, creativity, and learning new technologies. Here Debian has proved indispensable.

Attitude to large companies

Over time, I realized that I feel more comfortable minimizing my dependence on large companies. I don't reject their products outright; many of them are genuinely convenient and useful. But when it comes to data or privacy, I feel calmer when I control everything myself. That's why I didn't stop at Ubuntu and chose Debian. This doesn't mean that Ubuntu is bad; Debian is simply closer to me in spirit and approach.

The same goes for devices. I used to use an iPhone, but over time I gave it up. Now I only have an iPad, which I use exclusively for work. I realized that it’s easier and calmer for me when the system doesn’t try to lead me along a predetermined path. That’s why Debian became a natural choice for me.

My own mail server

Perhaps the most significant step towards independence was deploying my own mail server. I didn't want all my correspondence stored with Google or other large providers. When you administer servers yourself, you understand how easy it is for whoever stores someone's mail to read it. So I decided to set up my own server with Postfix and Dovecot and add SPF, DKIM and DMARC. It was not easy and took a lot of time, but now I am confident: my mail works the way I want it to and is stored where I decided. This feeling of control is very valuable.

Google and search

Of course, it is difficult to give up Google completely. Search remains the most convenient tool, and I use it. But I try to minimize the interaction and use alternatives where possible. It is not a question of total refusal, but of balance. I want to control my key data and trust the system I work in. Debian gives me this feeling, and that's what makes it my first choice.

When I first installed Debian, I had a simple question: how realistic is it to use Linux in my daily work? Experimenting out of curiosity is one thing; working every day, programming, creating, starting new projects is something else entirely. But I quickly realized that Linux suits this even better than Windows. And the longer I work in it, the stronger this feeling becomes.

Programming under Linux

Programming was the first reason I seriously started using Linux. Installing the necessary tools takes a few minutes. Python, PHP, C, MySQL: everything is available directly from the repositories. You can install a ready-made package or compile from source. No unnecessary fuss, no dubious sites with installer files. Everything is at hand, and everything works right away.

I work with different languages and development environments. PyCharm, Visual Studio Code, even the classic IDLE for Python — everything runs fine in Debian. At the same time, the system itself remains lightweight, without dozens of processes in the background. This creates the feeling that Linux is truly made for development. When you write code, nothing distracts or slows down. And the terminal becomes a universal tool for launching servers, compiling, debugging, and testing.

# Installing Python and pip
sudo apt install python3 python3-pip

# Quickly launching a local web server
python3 -m http.server 8080

# Installing a MySQL-compatible database server (Debian ships MariaDB)
sudo apt install mariadb-server

# Checking the gcc compiler version
gcc --version

When you work in Linux, you understand that development is not only about a code editor. It is a whole ecosystem of tools that fit together perfectly. And most importantly: you can customize everything to suit yourself.

3D modeling

Over time, I started doing 3D modeling. Blender on Linux works as if it were born in this system. No strange crashes (they happen, but rarely), no excessive load on the hardware. Even when scenes become heavy, the system remains responsive. On Windows it was different for me: you launch Blender, and after a couple of hours the fans howl, the computer noticeably heats up, and everything starts to slow down. In Debian everything is much calmer. The system behaves predictably, and that inspires confidence.

For 3D, you need more than just Blender. You often have to use additional utilities to convert models, textures, or work with video. Here, Linux shows its strength again. ffmpeg, imagemagick and dozens of other console tools allow you to do everything quickly and without unnecessary graphical interfaces.

# Converting video for preview
ffmpeg -i render.mp4 -vf scale=640:-1 preview.mp4

# Converting images to another format
mogrify -format jpg *.png

# Extracting a sequence of frames from animation
ffmpeg -i animation.mp4 frame_%03d.png

Game development

Another area where Linux opened up for me is game development. I tried Godot and, after some unpleasant events around that project, its fork Redot, and both work without problems under Linux. Installation is simple: download and use, no complicated installers or library conflicts. Everything runs out of the box, as the sketch below shows. And because the system is not overloaded with unnecessary processes, even resource-intensive projects run stably. I tried to run Unreal Engine 5 on Linux and succeeded, but I had to abandon the idea for lack of a powerful video card, which Unreal requires already at the development stage. And I will also mention, once again, my dislike of large corporations.
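
For illustration, this is the entire "installation" for Godot: a single self-contained binary from the project's GitHub releases (the version and file names below are just an example; check the releases page for the current ones):

# Download and unpack a release build (names vary by version)
wget https://github.com/godotengine/godot/releases/download/4.2-stable/Godot_v4.2-stable_linux.x86_64.zip
unzip Godot_v4.2-stable_linux.x86_64.zip

# Make it executable and run: no installer, no dependencies to resolve
chmod +x Godot_v4.2-stable_linux.x86_64
./Godot_v4.2-stable_linux.x86_64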

Game development is not only an engine. You often have to work with graphics, sound, network services. And here again the terminal comes to the rescue. Everything you need can be done with one command. For example, convert sound to the required format or quickly check the availability of the game server.

# Converting wav to ogg for the game
ffmpeg -i sound.wav -c:a libvorbis sound.ogg

# Checking the availability of the game server
ping -c 4 mygame-server.com

# Quickly launching a local test server
python3 -m http.server 9000

Hotkeys

Hotkeys deserve a special mention. I configured all my key programs to open instantly at a single touch. PyCharm, Blender, Redot, the terminal, Telegram: everything starts with a key combination. This speeds up work incredibly. No need to hunt for shortcuts, no need to dig through menus. One gesture, and the program is ready to use. In Windows, this takes third-party programs or workarounds (though of course the search menu is always there). In Linux, it is a built-in feature.
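
In GNOME this is usually done through Settings, but the same bindings can also be scripted with gsettings. A minimal sketch for one custom shortcut (the program and key combination here are just examples):

# Register one custom keybinding slot
gsettings set org.gnome.settings-daemon.plugins.media-keys custom-keybindings "['/org/gnome/settings-daemon/plugins/media-keys/custom-keybindings/custom0/']"

# Set the name, command and key combination for that slot
KEY=org.gnome.settings-daemon.plugins.media-keys.custom-keybinding:/org/gnome/settings-daemon/plugins/media-keys/custom-keybindings/custom0/
gsettings set $KEY name 'Terminal'
gsettings set $KEY command 'gnome-terminal'
gsettings set $KEY binding '<Super>Return'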

Laptop performance

My Acer Nitro 5 behaves differently under Linux than under Windows. When I worked in Windows 10 and 11, the laptop heated up even at idle and the fans never stopped. This was annoying. Under Debian the situation is the opposite: the system stays light and calm. Even when I work in Blender or Redot, the computer remains quiet and predictable. There is no constant feeling that something is "eating" resources in the background. Everything is under control.

This is important not only for comfort, but also for productivity. When the system is not distracting, you can focus on work. I began to notice that I work longer and more calmly. No need to interrupt to figure out what process has loaded the processor again or why the fan suddenly started making noise. Debian gives stability and confidence.

If we talk about what makes Linux unique, the first thing that comes to mind is the terminal. It is what turns the system from just a set of programs into a tool that can be customized to the smallest detail. Over time, I realized that in Linux, almost any task can be automated. Even what seemed impossible in Windows or required complex third-party programs can be solved here with a few lines in bash.

Cron: an indispensable assistant

Cron has become my faithful companion. This built-in task scheduler executes any commands on a schedule. No need for complex utilities with graphical interfaces: just edit one file, and the system will do the job itself, accurately and reliably.

# These jobs live in root's crontab (edit it with: sudo crontab -e)

# System update every Sunday at 5am
0 5 * * 0 apt update && apt upgrade -y

# Project backup every day at 3am
0 3 * * * tar -czf /backup/projects_$(date +\%F).tar.gz /home/user/projects

# Cleaning temporary files every three days
0 1 */3 * * rm -rf /tmp/*

# Checking site availability every 10 minutes ("mail" comes from the mailutils package)
*/10 * * * * ping -c 3 worktricks.de || echo "Site unavailable!" | mail -s "ALERT" admin@mydomain.com

What used to take hours is now done by itself. Once I set up cron, I can forget about these tasks — the system will do everything for me.

Bash scripts: automating the everyday

Bash has become a universal tool for me. At first it seemed complicated, but over time I realized that every line of the script saves me time. From backups to uploading files to the server — everything can be automated.

#!/bin/bash
# Automatic upload of new photos to the server
# (inotifywait comes from the inotify-tools package)
inotifywait -m /home/user/photos -e create |
while read -r path action file; do
    rsync -av "$path$file" user@server:/var/www/html/photos/
done

This simple script monitors a folder and immediately uploads new files to the server. On Windows, you would have to look for third-party software for this, but here everything is solved in a couple of lines.

ffmpeg — the magic of working with media

When I first got acquainted with ffmpeg, I realized how powerful console tools are. In Windows, I used bulky programs for editing or converting. In Debian, it's one line.

# Convert mp3 to wav
ffmpeg -i track.mp3 track.wav

# Cut a fragment from a video
ffmpeg -ss 00:01:30 -to 00:02:45 -i input.mp4 -c copy output.mp4

# Compress video to a smaller size
ffmpeg -i bigfile.mp4 -vcodec libx265 -crf 28 compressed.mp4

# Bulk convert all wav to mp3
for f in *.wav; do ffmpeg -i "$f" "${f%.wav}.mp3"; done

I use these commands all the time. Video, music, screenshots — everything is processed instantly, without graphical interfaces. And the more you work with ffmpeg, the more you understand: this is a tool that replaces dozens of programs.

grep, sed, awk — working with text

Working with Linux servers always means working with text: logs, configs, reports. grep, sed and awk have become real lifesavers for me. They let you quickly find what you need, then change and process the data.

# Find all lines with errors in logs
grep -i "error" /var/log/syslog

# Replace "oldtext" with "newtext" in all php files
sed -i 's/oldtext/newtext/g' *.php

# Show per-process memory usage (RSS in MB), skipping the header line
ps aux | awk 'NR>1 {print $6/1024 " MB\t" $11}' | sort -n

At first it seems too complicated. But once you try it, you realize that you can't do without these utilities.

Monitoring and Administration

The terminal allows you to monitor the system in real time. I can check the CPU temperature, monitor the load, analyze traffic. All this is done with a couple of commands.

# Checking the CPU temperature (the "sensors" tool comes from the lm-sensors package)
sensors

# Monitoring network traffic (iftop is installed separately and needs root)
sudo iftop

# List of processes with CPU load
top

# A more colorful top; requires a separate installation from the repository
htop

# Checking disk usage
df -h

It is these tools that make Linux ideal for administration. You know what is happening in the system, and you can manage it at any second.

Philosophy of automation

Over time, I realized: automation in Linux is not a luxury, but a style of work. Every day can be made easier if you hand the routine over to the system. Write a script once, and it works for you for years. Set up cron once, and the task is solved without your involvement. This saves time and effort. And most importantly, it feels like the computer truly works for you.

At some point, Linux stopped being just a working system on a laptop for me. I wanted to try more: to set up my own services, manage them, and understand how it all works at the server level. This is how a separate story began: the story of my mail server and my own website. These projects took time and effort, but they gave me a real feeling that Linux is a system without borders.

Why your own mail server

Previously, like many others, I used Gmail and other mail services. It is convenient, fast, familiar. But the more I worked with Linux servers, the more clearly I understood: my mail is stored by a third party, and they always have access to it. This was a big issue for me — I wanted to control the process myself. So I decided to set up my own mail server.

I'll be honest: it was a tough experience. I chose Postfix as my MTA and Dovecot as my IMAP and POP3 server. Setting up SPF, DKIM, and DMARC took more than one day. I read the documentation, tried different options, sometimes broke the configuration, and started from scratch. But in the end, the server worked, and for the first time, emails sent from my domain began to arrive without being marked as "spam." It was a real victory.

# Checking the mail server queue
sudo postqueue -p

# Deleting stuck emails
sudo postsuper -d ALL

# Logging mail server operation
tail -f /var/log/mail.log
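
One more check I run regularly: since SPF, DKIM and DMARC are just DNS records, their presence can be verified from any machine with dig (example.com and the "mail" selector below stand in for the real domain and DKIM selector):

# SPF record: a TXT record on the domain itself
dig +short TXT example.com

# DKIM public key, published under the selector
dig +short TXT mail._domainkey.example.com

# DMARC policy
dig +short TXT _dmarc.example.com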

Now I have a mail server that works the way I want it to. I know where my correspondence is stored, and I am sure that only I have access to it. Yes, this requires support, updates, and attention. But this is true independence. I am the master of my own mail.

Website as an experiment

The next step was my website. At first, it was a simple experiment: I wanted to see how the PHP and MySQL stack works, how to set up virtual hosts in Nginx, how to connect SSL certificates. But gradually the website turned into a full-fledged project. I started publishing articles on it, developing tools, and trying to integrate Python scripts.
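
For reference, a virtual host on Debian is just a file under /etc/nginx/sites-available that gets symlinked into sites-enabled. A minimal sketch (example.com and the certificate paths are placeholders; the paths follow the Let's Encrypt layout):

# Create a minimal virtual host (example.com is a placeholder)
sudo tee /etc/nginx/sites-available/example.com > /dev/null <<'EOF'
server {
    listen 443 ssl;
    server_name example.com;
    root /var/www/example.com;
    ssl_certificate /etc/letsencrypt/live/example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;
}
EOF

# Enable it, check the syntax, and reload Nginx
sudo ln -s /etc/nginx/sites-available/example.com /etc/nginx/sites-enabled/
sudo nginx -t && sudo systemctl reload nginx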

Working with the website showed me how convenient Linux is for administering web servers. Everything is at hand: PHP packages, Nginx modules, tools for working with databases. Setting up backups, monitoring logs, optimizing the load: all this has become a natural part of the work.

# Checking Nginx errors in real time
tail -f /var/log/nginx/error.log

# Backing up the MySQL database
mysqldump -u root -p mydb > mydb_backup.sql

# Restarting the Nginx server after changing the configuration
sudo systemctl reload nginx

Administration experience

Working with the server turned out to be a school that gave me much more than I expected. I learned how to manage services, monitor security, update packages without the risk of breaking the system. Systemd has become a universal tool for me to control services: start, stop, check the status. All this turned out to be much more convenient and transparent than in Windows.

# Checking the service status
systemctl status nginx

# Starting or stopping
sudo systemctl start postfix
sudo systemctl stop dovecot

# Adding a service to startup
sudo systemctl enable nginx

Gradually I realized: Linux server administration is not about "complexities". It is about understanding how the system works. Each command becomes a tool, each change in the configuration is a step towards a better understanding of the processes. And the more I did it, the more I got carried away.

Why it is important

For some, a mail server or their own website is a trifle. But for me, it has become a symbol of independence. I no longer depend on other people's decisions: neither on corporations nor on third-party services. I know that my site works on my server, that my mail is stored where I want. This feeling of freedom cannot be overestimated.

Yes, it requires knowledge and time. Sometimes something breaks, sometimes you need to untangle complex configurations. But this is where the value lies. Linux provides tools that let you fully manage your services. And when you see how everything works, how emails reach their recipients, how the site opens over HTTPS, you understand that none of it was in vain.

When I was just starting my journey with Linux, it was all more of an experiment. But over time, the system became my main working environment, and then a tool for server administration. Gradually, I realized that three things finally anchored me to Linux: security, containerization, and the open-source philosophy. They made Linux not just convenient for me, but the right choice.

Security as a Foundation

In Windows, security issues have always been addressed through antiviruses and additional programs. In Linux, security is built into the architecture itself: access rights, user separation, transparent operation of processes. This does not make the system absolutely impregnable, but it creates a foundation that can be trusted.
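
A small illustration of that model using nothing but standard tools (the key path and the service user name below are just examples):

# System files are protected by ownership and mode bits;
# /etc/shadow, for example, is readable only by privileged users
ls -l /etc/shadow

# Restrict a private SSH key to its owner
chmod 600 ~/.ssh/id_rsa

# Create a dedicated unprivileged user for a service
sudo adduser --system --no-create-home mymonitor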

I gradually mastered the tools that help strengthen security. One of them is iptables. With its help, you can flexibly manage network connections: allow access only from certain IPs, close unnecessary ports, filter traffic. At first the commands seemed complicated, but over time I realized that the logic behind them is simple.

# View current rules
sudo iptables -L -v -n

# Allow access to SSH only from one IP
sudo iptables -A INPUT -p tcp -s 203.0.113.5 --dport 22 -j ACCEPT

# Deny all other connections to SSH
sudo iptables -A INPUT -p tcp --dport 22 -j DROP

Another important tool is fail2ban. It automatically blocks IP addresses that make too many failed login attempts. This simple measure genuinely protects against brute-force attacks. Now, even if someone tries to guess a password, the server itself bans such a "guest".

# Checking the fail2ban status
sudo fail2ban-client status

# Viewing the list of blocked IPs
sudo fail2ban-client status sshd

And, of course, logs. In Linux, they play a special role. Everything that happens in the system can be tracked in the logs. journalctl and tail allow you to see events in real time and react quickly.

# View the latest system events
journalctl -xe

# Logs of a specific service
journalctl -u nginx.service

# Monitor the mail server in real time
sudo tail -f /var/log/mail.log

Docker and containerization

Docker was a real discovery for me. Before that, I installed everything directly on the system, and sometimes it turned into chaos. Containers change the approach: each service runs in its own isolated environment. Need to test a database? Launch a container. Want to spin up a web server quickly? Again, a container. Everything is clean and tidy, and everything runs in parallel without conflicts.

# Starting a container with MySQL
docker run --name mydb -e MYSQL_ROOT_PASSWORD=pass -d mysql:latest

# Starting the Nginx web server on port 8080
docker run --name web -p 8080:80 -d nginx

# Viewing running containers
docker ps

# Removing all stopped containers
docker container prune

With Docker, I was able to experiment without fear of "breaking" something in the system. Everything works separately, and if something goes wrong, the container can be deleted and a new one can be launched. This turned out to be not only convenient, but also gave a sense of freedom for experiments.

systemd and service management

Another powerful tool for me was systemd. With its help, you can manage services: start, stop, restart. And most importantly, create your own. For example, I set up automatic restart of the Python script that monitors the state of my site. This is convenient and reliable: if the script terminates for some reason, systemd will start it again.

# Example of a unit file for systemd
[Unit]
Description=Website Monitor

[Service]
ExecStart=/usr/bin/python3 /home/user/scripts/check_site.py
Restart=always

[Install]
WantedBy=multi-user.target

Now I have full control over service management. I can start them manually, add them to startup, configure restarts on failures. And all this in a few commands.
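
For completeness, here is roughly how such a unit is put into service, assuming the file above was saved as check_site.service (the name is just an example):

# Install the unit and make systemd re-read its configuration
sudo cp check_site.service /etc/systemd/system/
sudo systemctl daemon-reload

# Start it now and enable it at boot
sudo systemctl enable --now check_site.service

# Verify that it is running
systemctl status check_site.service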

Open-source philosophy

But the most important thing I discovered in Linux is the open-source philosophy. It manifests itself in everything: from small utilities to large projects like Debian. Everything here is built on openness and community. I can download a program for free, look at its source code, understand how it works. This gives trust. Even if something goes wrong, I know: you can always find a solution in the community or write your own.

Open-source for me has become not only about software, but also about a way of thinking. It is about freedom of choice, about the ability to control your data and your computer. I no longer feel that the system makes decisions for me. I manage it myself, and this is what makes Linux a natural choice for me.

Raspberry Pi as a continuation of this story

When I finally got used to Linux, I wanted to expand the boundaries of its application. So I got a Raspberry Pi 5. This small but powerful device became a separate project and a new level of experimentation. I installed minidlna on it and made a home media center. Now movies and music are available on all devices in the home network, and the laptop itself does not need to be on all the time. In between, the Raspberry Pi handles other tasks: it runs long-lived programs around the clock that do not need the main computer. This saves electricity and makes better use of the hardware.
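
The media center setup itself is a few minutes of work on a Debian-based system (the media paths below are just examples; adjust to taste):

# Install the DLNA server
sudo apt install minidlna

# Point it at the media folders in /etc/minidlna.conf, for example:
#   media_dir=V,/srv/media/movies
#   media_dir=A,/srv/media/music

# Restart so the configuration is picked up and the library rescanned
sudo systemctl restart minidlna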

I connected a webcam to the Raspberry Pi. Now I can turn it on remotely and watch the house when I'm not around. Sometimes it's just fun to see what the cat is doing while the owner is not around. And sometimes it's a security element. All this works through an SSH tunnel from the Raspberry Pi to the server. For recording, I use the motion program: it records video only when motion is detected. This means that only useful information is saved on the disk, and not hours of idle time.
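
The plumbing behind this is simple: motion serves its live stream on a local port (8081 by default), and a reverse SSH tunnel makes it reachable from the server (user@myserver.example is a placeholder):

# On the Raspberry Pi: forward motion's stream port to the server
ssh -N -R 8081:localhost:8081 user@myserver.example

# On the server, the camera stream is then available at http://localhost:8081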

Raspberry Pi has become an example for me of how Linux connects different devices into a single ecosystem. SSH allows hardware to "talk" to each other, and I feel that there are more and more development options in this area. Every day you can come up with a new scenario: automation, surveillance, server tasks, smart home management. This is an endless field for experiments that will never get boring. After getting acquainted with Linux, I realized: there is always something to do, there is always room for development. And this is its main strength.

Conclusion

Linux has given me three things that I value most: security, flexibility, and openness. With iptables and fail2ban, I protect my services. With Docker, I can experiment and launch new projects without risk to the system. With systemd, I manage processes the way I like. And the open-source philosophy gives me confidence that I'm using a system that I can trust. For me, it's not just an operating system: it's a full-fledged environment for work, creativity and experiments. And that's why I settled on Linux.