SimRacing on Linux: A New Reality - BoxThisLap
What Happened After Google Retrofitted Memory Safety Onto Its C++ Codebase?
New Pentagon Report on UFOs: Hundreds of New Incidents, No Evidence of Aliens
8 Escaped Monkeys Remain at Large, Now Joined By Two Fugitive Emus
Does Google Plan to Create Email Aliases for Apps to Fight Spam?
Distribution Release: Red Hat Enterprise Linux 9.5
Distribution Release: RELIANOID 7.5
Distribution Release: CachyOS 241110
DistroWatch Weekly, Issue 1096
Review: Bazzite 40, Playtron OS Alpha 1, Tucana Linux 3.1
News: Redox being ported to RISC-V and running the COSMIC software centre, TrueNAS talks about new ZFS features, FreeBSD booting on the PinePhone Pro, Debian refreshes install media, LXQt supports multiple Wayland window managers
What’s KernelCare?
This article explains what you need to know about KernelCare. Before getting into KernelCare itself, let's do a quick recap of the Linux kernel; it will help you understand KernelCare better. The Linux kernel is the core of the Linux operating system: it resides in memory and tells the CPU what to do.
Now let's begin with today's topic: KernelCare. If you're a system administrator, this article will be especially valuable for you.
What is KernelCare?
KernelCare is a live patching service that delivers security updates for Linux kernels, shared libraries, and embedded devices. It patches security vulnerabilities inside the running kernel without service interruptions or downtime. Once you install KernelCare on a server, security updates are checked for and applied automatically every four hours, removing the need to reboot the server after updates.
It is a commercial product, and its software is licensed under GNU GPL version 2. CloudLinux, Inc. developed it. The first beta version of KernelCare was released in March 2014, followed by its commercial launch in May 2014. Since then, CloudLinux has added various useful integrations for automation tools, vulnerability scanners, and other systems.
Operating systems supported by KernelCare include CentOS/RHEL 5, 6, and 7; CloudLinux 5 and 6; OpenVZ, PCS, and Virtuozzo; Debian 6 and 7; and Ubuntu 14.04.
Is KernelCare Important?
Wondering whether KernelCare matters for you? By installing the latest kernel security patches promptly, you minimize potential risks. Updating the Linux kernel manually can take hours, and beyond the server downtime it is a stressful job for system admins and clients alike.
Once kernel updates are applied, the server needs a reboot, usually scheduled during off-peak hours, which adds further stress. Skipping those reboots, however, leaves serious security holes open. Sometimes a server doesn't come back up cleanly after a reboot, and fixing it is a headache for system admins; often they must roll back all the applied updates just to get the server running again quickly.
With KernelCare, you can avoid such issues.
How Does KernelCare Work?
KernelCare eliminates the non-compliance and service interruptions caused by reboot-based updates. The KernelCare agent resides on your server and periodically checks for new updates; when it finds any, it downloads them and applies them to the running kernel. A KernelCare patch is a piece of code that replaces vulnerable or buggy code in the kernel while it runs.
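On a supported system, the day-to-day workflow is short enough to sketch here. Treat the following as a hedged illustration rather than official documentation: the installer URL and kcarectl options shown are the ones CloudLinux has commonly documented, but verify them against the current KernelCare docs for your platform.

# Install the agent (example installer URL; confirm it in the current KernelCare documentation)
$ curl -s -L https://kernelcare.com/installer | sudo bash

# Show applied patches and the effective (patched) kernel version
$ sudo kcarectl --info
$ sudo kcarectl --uname

# Check for and apply new patches immediately instead of waiting for the 4-hour cycle
$ sudo kcarectl --update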
Getting Started with Docker Semi-Self-Hosting on Linode
With the evolution of technology, we find ourselves needing to be ever more vigilant about our online security. Our browsing and shopping behavior is continuously tracked via cookies dropped in our browsers, which we allow by clicking the "I Accept" button next to deliberately long agreements before we can get the full benefit of a site.
Additionally, hackers are always looking for targets, and it's common for even big companies to have their servers compromised and sensitive data leaked, often to the highest bidder.
These are just some of the reasons that I started looking into self-hosting as much of my own data as I could.
Not everyone has the option to self-host on their own private hardware, whether for lack of hardware or because their ISP makes it difficult or impossible to do so. So I want to show you what I believe is the next best step: a semi-self-hosted solution on Linode.
Let's jump right in!
Setting up a Linode
First things first, you'll need a Docker server. Linode makes that process very simple: you can set one up for just a few bucks a month, add a private IP address for free, and add backups for a couple of bucks more per month.
Log into your Linode account and click "Create Linode".
On the "Create" page, click on the "Marketplace" tab and scroll down to the "Docker" option. Click it.
With Docker selected, scroll down and close the "Advanced Options" as we won't be using them.
Below that, we'll select the most recent version of Debian (version 10 at the time of writing).
To get the lowest latency for your setup, select the Region nearest you.
When we get to the "Linode Plan" area, find an option that fits your budget. You can always start with a small plan and upgrade later as your needs grow.
Next, enter a "Linode Label" as an identifier for you. You can enter tags if you want.
Enter a Root Password and import an SSH key if you have one. If you don't, that's fine; an SSH key isn't required. If you'd like to generate and use one, see "Creating an SSH Key Pair and Configuring Public Key Authentication on a Server".
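Once the Linode finishes provisioning, it is worth confirming that the Marketplace script actually installed Docker before you start deploying containers. The commands below are a minimal sanity check; the IP address is a placeholder for your Linode's public address.

$ ssh root@203.0.113.10    # replace with your Linode's IP address
# docker --version
# docker run hello-world    # pulls and runs a tiny test container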
Manage Java versions with SDKMan
Java is more than just a programming language: It's also a runtime.
Applications written in Java are compiled to Java bytecode then interpreted by a Java Virtual Machine (JVM), which is why you can write Java on one platform and have it run on all other platforms.
A challenge can arise, however, when a programming language and an application develop at different rates. It's possible for Java (the language) to increment its version number at the same time your favorite application continues to use an older version, at least for a while.
If you have two must-have applications, each of which uses a different version of Java, you may want to install both an old version and a new version of Java on the same system. If you're a Java developer, this is particularly common, because you might contribute code to several projects, each of which requires a different version of Java.
The SDKMan project makes it easy to manage different versions of Java and related languages, including Groovy, Scala, Kotlin, and more.
SDKMan is like a package manager just for versions of Java.
Install SDKMan
SDKMan requires these commands to be present on your system:
- zip
- unzip
- curl
- sed
On Linux, you can install these using your package manager. On Fedora, CentOS Stream, Mageia, and similar:
$ sudo dnf install zip unzip curl sed
On Debian-based distributions, use apt instead of dnf. On macOS, use MacPorts or Homebrew. On Windows, you can use SDKMan through Cygwin or WSL.
Once you've satisfied those requirements, download the SDKMan install script:
$ curl "https://get.sdkman.io" --output sdkman.sh
Take a look at the script to see what it does, and then make it executable and run it:
$ chmod +x sdkman.sh
$ ./sdkman.sh

Configure
When the installation has finished, open a new terminal, or run the following in the existing one:
$ source "$HOME/.sdkman/bin/sdkman-init.sh"
Confirm that it's installed:
$ sdk version

Install Java with SDKMan
Now when you want to install a version of Java, you can do it using SDKMan.
First, list the candidates for Java available:
$ sdk list java
=================================================
Available Java Versions for Linux 64bit
=================================================
Vendor | Version | Dist | Identifier
-------------------------------------------------
Gluon | 22.0.0.3.r17 | gln | 22.0.0.3.r17-gln
| 22.0.0.3.r11 | gln | 22.0.0.3.r11-gln
GraalVM | 22.0.0.2.r17 | grl | 22.0.0.2.r17-grl
| 21.3.1.r17 | grl | 21.3.1.r17-grl
| 20.3.5.r11 | grl | 20.3.5.r11-grl
| 19.3.6.r11 | grl | 19.3.6.r11-grl
Java.net | 19.ea.10 | open | 19.ea.10-open
| 18 | open | 18-open
| 17.0.2 | open | 17.0.2-open
| 11.0.12 | open | 11.0.12-open
| 8.0.302 | open | 8.0.302-open
[...]
This provides a list of different Java distributions available across several popular vendors, including Gluon, GraalVM, OpenJDK from Java.net, and many others.
You can install a specific version of Java using the value in the Identifier column:
$ sdk install java 11.0.12-open
The sdk command uses tab completion, so you don't need to view a list. Instead, you can type sdk install java 11 and then press Tab a few times to see the options.
Alternately, you can just install the default latest version:
$ sdk install java

Set your current version of Java
Set the version of Java for a terminal session with the use subcommand:
$ sdk use java 17.0.2-open
To set a version as default, use the default subcommand:
$ sdk default java 17.0.2-open
Get the current version in effect using the current subcommand:
$ sdk current java
Using java version 17.0.2-open

Removing Java with SDKMan
You can remove an installed version of Java using the uninstall subcommand:
$ sdk uninstall java 11.0.12-open

More SDKMan
You can do more customization with SDKMan, including updating and upgrading Java versions and creating project-based environments. It's a useful command for any developer or user who wants the ability to switch between versions of Java quickly and easily.
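The project-based environments mentioned above deserve a quick sketch. This assumes a reasonably recent SDKMan release that includes the env subcommand and its .sdkmanrc file; the version identifier is just an example.

$ cd my-project
$ sdk env init    # writes a .sdkmanrc pinning the Java version currently in use
$ cat .sdkmanrc
java=17.0.2-open
$ sdk env         # switches this shell session to the version pinned in .sdkmanrc

Commit the .sdkmanrc file with the project, and anyone who clones it can run sdk env to get the same Java version.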
If you love Java, or use Java, give SDKMan a try. It makes Java easier than ever!
Open exchange, open doors, open minds: A recipe for global progress
Could open organization principles successfully apply to entire societies?
That's the question I asked as I read the book Open: The Story of Human Progress by Johan Norberg, which aims to examine the relative success of "open societies" throughout global history.
In this review—the first article in an extended discussion of the work from Open Organization community members—I will summarize more precisely what Norberg means when he uses the term "open" and offer an initial assessment of his arguments. Ultimately, however, our discussion will explore more expansive themes, like:
- the importance of open societies,
- what the future could (or should) look like in a more open world, and
- how these principles impact our collective understanding of how organizations operate in service of "the greater good"
Essentially, Norberg is looking at four dimensions of "open," which he calls:
- "open exchange" (global goods and service flows across borders),
- "open doors" (global movement of people),
- "open minds" (global receptivity to new and different ideas), and
- "open societies" (how cultures should be governed to benefit from the above three)
Let me discuss each one more extensively.
Open exchange
Norberg uses the phrase "open exchange" to refer to the movement of goods and services not just across borders but within them as well. Simply put, he believes that people across the world prosper when trade increases, because increased trade leads to increased cooperation and sharing.
His argument goes like this: when a nation (and to be sure, Norberg aims his advice at contemporary nation-states) welcomes foreign goods into its market, it generally gains expertise, skills, and knowledge as well. Surplus goods and services should be sold anywhere they might provide value for someone else—and the returns might include favors, ideas, and knowledge, not just goods and services themselves. For Norberg, reciprocity and relatively equal exchange are an unavoidable aspect of human nature, building binding relationships that promote more generosity. Generosity in turn promotes more trade, creating a cycle of prosperity for all involved.
This view holds for organizations working with uncommon trade partners as well. Greater organizational specificity leads to the need for more cooperation and sharing, which leads to even more specialization. So here we can see a link between open societies and open organizations regarding trade issues.
Open doors
For Norberg, "open doors" refers to people's ability to move across national borders, for one reason or another. He believes the gradual inclusion of foreigners into a society leads to more novel and productive interactions, which leads to greater innovation, more ideas, and more rapid discoveries. For a society to be productive, it must get the right talent performing the right tasks. Norberg argues that there should be no barriers to that match-up, and people should be mobile, even across borders, so they can achieve it.
Norberg outlines how, throughout history, diverse groups of people solve problems more effectively—even if they create more friction as they do so, as members have their assumptions questioned. This kind of open environment must be promoted, supported, and managed, however, in order to avoid groupthink, the predominance of voices that are merely the loudest, and the outsized influence of niche interests.
Critical to the success of "open doors" are recognition, respect, understanding, acceptance, and inclusivity toward others. Norberg discusses the importance of these qualities, citing the World Values Survey, which measures some of them. Done well, open doors can allow societies to cross-fertilize, borrowing ideas and technology from each other and multiplying that which works best.
We could say that's equally true for an organization wanting to develop a new product or market, too.
Open minds
"Open economies stimulate open-mindedness," Norberg writes. For him, "open minds" are those receptive to thoughts and belief systems that may seem different, foreign, or alien to them—those that both offer and receive different perspectives. Open minds, Norberg claims, lead to more rapid progress.
Open minds flourish when given the space to encounter new ideas and explore them freely—rather than, say, simply accept the given dogma of an age. According to Norberg, people from a wide range of disciplines, specialties, and skills coming together and sharing their perspectives stimulates growth and progress. But this is only possible when they exist in an environment where they feel free to question the status quo and possibly overturn long-standing beliefs. Barriers to creating those environments certainly exist (in fact, the entire second half of Norberg's book offers a deeper analysis of them).
Of course this is true in organizations as well. The more people (and the more different people) who look at a problem, the better. This not only leads to faster solutions but helps overcome anyone's individual biases. Serendipitous solutions to problems can seemingly come out of nowhere more often, as there will be better and more peer review of strongly held positions. And yet differences create friction, so standards of protocol and behavior are required to ensure progress.
For Norberg, the world benefits when scientists, philosophers, industrialists, and craftspeople can influence one another's thinking (and are receptive to having their thinking changed!). The same is true in open organizations when people with different roles and functions can work together and enrich one another's thinking. More experiments and greater collaboration among disciplines lead to richer discoveries.
Open societies
Combining open minds, open exchange, and open doors can lead to fully open societies globally, Norberg argues, and "the result is discoveries and achievements." Governments, he asserts, should work to foster those kinds of societies across the globe. In this way, societies can tap into the greatest talent from the entire global community.
According to Norberg, more inclusive societies based on these open policies can lead to material gains for people—fewer hours working, the ability to launch careers earlier (or retire earlier), longer lives in general, and more. This is not to mention reductions in extreme poverty, child and maternal mortality, and illiteracy globally. On top of that, for Norberg, global cultural collaboration leads to better utilization of ecological, natural, and environmental resources. All this can be achieved through specialization that, combined with openness, advances societies at an exponential rate.
Open makes a historical argument. Norberg believes that throughout the ages it was not defenders of tradition that prospered most. Instead, those thinkers, engineers, and philosophers that challenged the status quo made the greatest contribution to global prosperity. Those figures benefitted from societies that were more open to improvements because they governed their own experiments, fostered rapid feedback loops, and built systems that quickly self-correct during setbacks.
Yet like any history, Norberg's is partial and selective, presenting isolated cases and examples. And some of those include even the most brutal empires, whose violence Norberg tends to overlook. In future parts of this review, we'll dive more deeply into various aspects of Norberg's analysis—and discuss its implications for thinking about a more open future.
22 Raspberry Pi projects to try in 2022
The possibilities for Raspberry Pi projects keep multiplying this Pi Day! The beloved single-board computer recently turned ten years old. To celebrate, we put together a list of recent Raspberry Pi tutorials written by members of the Opensource.com community.
10 Raspberry Pi projects for your home
The Raspberry Pi is ripe for DIY projects for the home. Why risk your data with a proprietary home automation tool when you can take full control with a $35 computer? Opensource.com authors have shared how they've built thermostats, monitored their home climate, set parental controls, and much more in the following tutorials.
- Build a home thermostat with a Raspberry Pi The ThermOS project is an answer to the many downsides of off-the-shelf smart thermostats.
- Monitor your home's temperature and humidity with Raspberry Pis and Prometheus Instrument a Prometheus application with Python on Raspberry Pis to collect temperature sensor data.
- Set up temperature sensors in your home with a Raspberry Pi Find out how hot your house is with a simple home Internet of Things project.
- Build a router with mobile connectivity using Raspberry Pi Use OpenWRT to get more control over your network's router.
- Troubleshoot WiFi problems with Go and a Raspberry Pi Build a WiFi scanner for fun.
- Set up network parental controls on a Raspberry Pi With minimal investment of time and money, you can keep your kids safe online.
- Monitor your greenhouse with CircuitPython and open source tools Keep track of your greenhouse's temperature, humidity, and ambient light using a microcontroller, sensors, Python, and MQTT.
- Collect sensor data with your Raspberry Pi and open source tools Learning more about what is going on in your home is not just useful; it's fun!
- Measure your Internet of Things with Raspberry Pi and open source tools Setting up an environment-monitoring system demonstrates how to use open source tools to keep tabs on temperature, humidity, and more.
- Track your family calendar with a Raspberry Pi and a low-power display Help everyone keep up with your family's schedule using open source tools and an E Ink display.
You can be productive without a ton of fancy tools. Whether you want to host your personal blog or start crypto trading with a reduced carbon footprint, the Raspberry Pi has you covered.
- Host your website with dynamic content and a database on a Raspberry Pi You can use free software to support a web application on a very lightweight computer.
- Use your Raspberry Pi as a productivity powerhouse The Raspberry Pi has come a long way from being primarily for hacking and hobbyists to a solid choice for a small productive workstation.
- Run your blog on a Raspberry Pi I set up a Raspberry Pi to act as a web server to host my personal blog on Drupal.
- Use your Raspberry Pi as a data logger Here's how to log the CPU temperature of a Raspberry Pi and create a spreadsheet-based report on demand.
- Convert your Raspberry Pi into a trading bot with Pythonic Reduce your power consumption by setting up your cryptocurrency trading bot on a Raspberry Pi.
The Raspberry Pi is probably most famous for its serious use case of fun! The Pi offers lots of options for tinkering with Linux, learning about computers, or celebrating your favorite holiday.
- Create a countdown clock with a Raspberry Pi Start counting down the days to your next holiday with a Raspberry Pi and an ePaper display.
- Track aircraft with a Raspberry Pi Explore the open skies with a Raspberry Pi, an inexpensive radio, and open source software.
- Control your Raspberry Pi remotely with your smartphone Control the GPIOs of your Raspberry Pi remotely with your smartphone.
- Build a programmable light display on Raspberry Pi Celebrate the holidays or any special occasion with a DIY light display using a Raspberry Pi, Python, and programmable LED lights.
- Make an automated Jack-o'-lantern with a Raspberry Pi Here's my recipe for the perfect pumpkin Pi.
- Cast your Android device with a Raspberry Pi Use Scrcpy to turn your phone screen into an app running alongside your applications on a Raspberry Pi or any other Linux-based device.
- Learn everything about computers with this Raspberry Pi kit The CrowPi is an amazing Raspberry Pi project system housed in a laptop-like body.
Go ahead and mark your calendar for trying out a few of these creative Raspberry Pi projects this year.
Collect sudo session recordings with the Raspberry Pi
I've used the sudo command for years, and one of my favorite features is how it saves a record of everything happening in a terminal while running a command. This feature has been available for over a decade. However, sudo 1.9 introduced central session recording collection, allowing you to check all administrative access to your hosts on your network at a single location and play back sessions like a movie.
I use this feature on my Raspberry Pi, and I recommend it to other Pi users. Even if you fully trust your users, logs and session recordings can help debug what happened on a given host if it acts strangely: Oops, wrong file deleted in /etc.
Why sudo?
Sudo gives administrative access to users. Unless you limit access to a short list of commands, you practically provide full access to your hosts. The pi user can use sudo without even entering a password on the Raspberry Pi OS. On other operating systems, the default configuration grants members of the wheel group full administrative access.
Before you begin
The new sudo_logsrvd application handles collection. Earlier versions of the Raspberry Pi OS only had sudo version 1.8. The latest version is based on Debian 11 and includes sudo version 1.9.5. You also need a second host with sudo 1.9, which sends recordings to sudo_logsrvd.
Configuring sudo_logsrvd
For a production environment, I recommend using TLS-encrypted connections between sudo and sudo_logsrvd. However, to simply understand how session recording works, you can go without encryption. This also means that there is nothing to configure other than creating the storage directory and starting sudo_logsrvd:
$ sudo mkdir /var/log/sudo-io
$ sudo chmod 700 /var/log/sudo-io
$ sudo sudo_logsrvd
The sudo_logsrvd daemon is now waiting for connections.
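If you want to confirm that it is listening before moving on, a quick check like this should work, assuming sudo_logsrvd is using its default port, 30343 (the same port referenced in the log_servers line below):

$ sudo ss -ltnp | grep 30343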
Configuring sudo
Configure sudo 1.9 on a host using visudo and append the following lines to the sudoers file. You will need to replace the IP address with that of your Raspberry Pi. Note that if you do not have a second machine with sudo 1.9, you can use the same Raspberry Pi running sudo_logsrvd for testing.
Defaults ignore_iolog_errors
Defaults log_servers = 172.16.167.129:30343
Defaults log_output
The first line is your escape route while experimenting with sudo_logsrvd: It ensures that sudo works even if sudo_logsrvd is inaccessible. This configuration is not recommended for production environments as users can execute commands without proper recording.
The next two lines configure where to send recordings and enable recordings.
Testing
For testing, do something that you cannot figure out from the sudo logs in syslog: a shell session. Be aware that sudo 1.9.8 changes this, but it is not yet available in Linux distributions. In this case, the logs show only that a shell was started, but nothing about what happened inside:
$ sudo -s
# id
uid=0(root) gid=0(root) groups=0(root),117(lpadmin)
# cd /root/
# ls -la
total 36
drwx------ 5 root root 4096 Feb 16 12:27 .
drwxr-xr-x 18 root root 4096 Jan 28 04:22 ..
-rw------- 1 root root 827 Feb 16 12:49 .bash_history
-rw-r--r-- 1 root root 571 Apr 10 2021 .bashrc
drwx------ 3 root root 4096 Feb 16 10:54 .cache
-rw------- 1 root root 41 Feb 16 11:12 .lesshst
drwxr-xr-x 3 root root 4096 Feb 16 12:27 .local
-rw-r--r-- 1 root root 161 Jul 9 2019 .profile
drwx------ 3 root root 4096 Jan 28 04:21 .vnc
# exit
$
Even if the logs do not show anything useful, you can still use the sudoreplay command to list and playback recordings:
$ sudo sudoreplay -l
Feb 16 12:37:54 2022 : pi : TTY=/dev/pts/1 ; CWD=/home/pi ; USER=root ; HOST=raspberrypi ; TSID=000001 ; COMMAND=/usr/bin/ls -l /etc/ssl/private/
Feb 16 12:38:14 2022 : pi : TTY=/dev/pts/1 ; CWD=/home/pi ; USER=root ; HOST=raspberrypi ; TSID=000002 ; COMMAND=/usr/bin/ls -la /etc/ssl/private/
Feb 16 12:49:21 2022 : pi : TTY=/dev/pts/1 ; CWD=/home/pi ; USER=root ; HOST=raspberrypi ; TSID=000003 ; COMMAND=/bin/bash
Feb 16 12:50:03 2022 : pi : TTY=/dev/pts/1 ; CWD=/home/pi ; USER=root ; HOST=raspberrypi ; TSID=000004 ; COMMAND=/bin/bash
Feb 16 12:50:28 2022 : pi : TTY=/dev/pts/1 ; CWD=/home/pi ; USER=root ; HOST=raspberrypi ; TSID=000005 ; COMMAND=/usr/bin/sudoreplay -l
$ sudo sudoreplay 000004
Replaying sudo session: /bin/bash
# id
uid=0(root) gid=0(root) groups=0(root),117(lpadmin)
# cd /root/
# ls -la
total 36
drwx------ 5 root root 4096 Feb 16 12:27 .
drwxr-xr-x 18 root root 4096 Jan 28 04:22 ..
-rw------- 1 root root 827 Feb 16 12:49 .bash_history
-rw-r--r-- 1 root root 571 Apr 10 2021 .bashrc
drwx------ 3 root root 4096 Feb 16 10:54 .cache
-rw------- 1 root root 41 Feb 16 11:12 .lesshst
drwxr-xr-x 3 root root 4096 Feb 16 12:27 .local
-rw-r--r-- 1 root root 161 Jul 9 2019 .profile
drwx------ 3 root root 4096 Jan 28 04:21 .vnc
# exit
$

What is next?
I hope you learned something new today and will try it on your own Raspberry Pi. The setup I described here is good enough for testing. For production use, I recommend creating a startup script for sudo_logsrvd, which is missing from the Debian package, and you should use TLS between sudo and sudo_logsrvd. You can learn more about configuring TLS encryption from the documentation or my blog. The nice thing is that you can also use sudo_logsrvd on the Raspberry Pi in production in your home or small office. Unless you have dozens of sudo clients all utilizing the terminal heavily (like ls -laR /), not even the SD card of the Pi is a bottleneck.
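As a starting point for that startup script, a minimal systemd unit might look like the sketch below. This is an illustrative assumption rather than the author's setup: check where sudo_logsrvd lives on your system (for example with which sudo_logsrvd) and confirm the -n/--no-fork option in your sudo version's man page.

# /etc/systemd/system/sudo_logsrvd.service (hypothetical unit; adjust paths for your system)
[Unit]
Description=Sudo central session recording collector
After=network.target

[Service]
ExecStart=/usr/sbin/sudo_logsrvd -n
Restart=on-failure

[Install]
WantedBy=multi-user.target

Enable it with sudo systemctl enable --now sudo_logsrvd, and switch the sudo-to-sudo_logsrvd connection to TLS before trusting it in production.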
Use your Raspberry Pi as a data logger
Data logging can be done for various reasons. In a previous article, I wrote about how I monitor the electricity consumption of my household. The Raspberry Pi platform is a perfect match for such applications as it allows communication with many kinds of analog and digital sensors. This article shows how to log the CPU temperature of a Raspberry Pi and create a spreadsheet-based report on demand. Logging the CPU temperature won't require any additional boards or sensors.
Even without a Raspberry Pi, you can follow the steps described here if you replace the specific parts of the code.
Setup
The code is based on Pythonic, a graphical Python programming frontend. The easiest way to get started with Pythonic is to download and flash the Raspberry Pi image. If you don't have a Raspberry Pi, use one of the other installation methods mentioned on the GitHub page (e.g., Docker or pip).
Once installed, connect the Raspberry Pi to the local network. Next, open the web-based GUI in a browser by navigating to http://pythonicrpi:7000/.
You should now see the following screen:
Download and unzip the example available on GitHub. The archive consists of several file types.
Use the green-marked button to upload current_config.json, and use the yellow-marked button to upload the XLSX file and the remaining *.py files.
You should have this configuration in front of you after you upload the files:
Implementation
The application can be separated into two logical parts: logging and report generation. Both parts run independently of each other.
Logging
The top part of the configuration can be summarized as the logging setup:
Involved elements:
- ManualScheduler - 0x0412dbdc: Triggers connected elements on startup (or manually).
- CreateTable - 0x6ce104a4: Assembles an SQL query which creates the working table (if not already existent).
- Scheduler - 0x557616c2: Triggers subsequent element every 5 seconds.
- DataAcquisition - 0x0e7b8360: Here we collect the CPU temperature and assemble an SQL query.
- SQLite - 0x196f9a6e: Represents an SQLite database, accepts the SQL queries.
I will take a closer look at DataAcquisition - 0x0e7b8360. Open the built-in web editor (code-server) by navigating to http://pythonicrpi:8000/. You can see all the element-related *.py files in the left pane. The DataAcquisition element is based on the type Generic Pipe. Open the file with the related id:
generic_pipe_0x0e7b8360.py
In this element, responsible for reading the CPU temperature, you can uncomment the lines of code depending on whether you're running this on a Raspberry Pi or not.
The above code produces an SQL query that inserts a row in the table my_table containing the Unix timestamp in seconds and the actual CPU temperature (or a random number). The code is triggered every five seconds by the previous element (Scheduler - 0x557616c2). The SQL query string is forwarded to the connected SQLite - 0x196f9a6e element, which applies the query to the related SQLite database on the file system. The process logs the CPU temperature in the database with a sampling rate of 1/5 samples per second.
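Because the element's source is not reproduced here, the following is a rough, hypothetical sketch of what that logic boils down to. It is not the actual generic_pipe_0x0e7b8360.py, which wraps this in Pythonic's element interface and forwards the query string downstream.

import random
import time

def read_cpu_temp():
    # On a Raspberry Pi the CPU temperature is exposed in millidegrees Celsius;
    # on other machines, fall back to a random value so the pipeline still produces data.
    try:
        with open('/sys/class/thermal/thermal_zone0/temp') as f:
            return int(f.read().strip()) / 1000.0
    except OSError:
        return random.uniform(40.0, 60.0)

# One row per trigger: Unix timestamp in seconds plus the temperature reading
query = 'INSERT INTO my_table VALUES ({}, {});'.format(int(time.time()), read_cpu_temp())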
Report generation
The bottom network generates a report on request:
Involved elements:
- ManualScheduler - 0x7c840ba9: Activates the connected Telegram bot on startup (or manually).
- Telegram - 0x2e4148e2: Telegram bot that serves as the interface for requesting and delivering reports.
- GenericPipe - 0x2f78d74c: Assembles the SQL query that selects the data for the report.
- SQLite - 0x5617d487: Executes that query against the SQLite database and returns the matching rows.
- ReportGenerator - 0x13ad992a: Creates an XLSX-based report from the data.
The example code contains a spreadsheet template (report_template.xlsx) which also belongs to this configuration.
Note: To get the Telegram bot running, provide a Telegram bot token to communicate with the server. core.telegram.org describes the process of creating a bot token.
The Telegram element outputs a request as a Python string when a user requests a report. The GenericPipe - 0x2f78d74c element that receives the request assembles an SQL query, which is forwarded to the SQLite - 0x5617d487 element. The data read by that query is then sent to ReportGenerator - 0x13ad992a, which I will take a closer look at:
generic_pipe_13ad992a.py
def execute(self):
    path = Path.home() / 'Pythonic' / 'executables' / 'report_template.xlsx'

    try:
        wb = load_workbook(path)
    except FileNotFoundError as e:
        recordDone = Record(PythonicError(e), 'Template not found')
        self.return_queue.put(recordDone)
        con.close()  # con: database connection defined elsewhere in the element
        return
    except Exception as e:
        recordDone = Record(PythonicError(e), 'Open log for details')
        self.return_queue.put(recordDone)
        con.close()
        return

    datasheet = wb['Data']
    # create an iterator over the rows in the datasheet (openpyxl indices are 1-based)
    rows = datasheet.iter_rows(min_row=2, max_row=999, min_col=1, max_col=2)
In the first part, I use the load_workbook() of the openpyxl library to load the spreadsheet template. If successfully loaded, I acquire a reference to the actual sheet in the datasheet variable. Afterward, I create an iterator over the rows in the datasheet, which is stored in the variable rows.
    # Convert Unix time [s] back into a datetime object; returns an iterator
    reportdata_dt = map(lambda rec: (datetime.datetime.fromtimestamp(rec[0]), rec[1]), self.inputData)

    # iterate till the first iterator is exhausted
    for (dt, val), (row_dt, row_val) in zip(reportdata_dt, rows):
        row_dt.value = dt
        row_val.value = val

    reportDate = datetime.datetime.now().strftime('%d_%b_%Y_%H_%M_%S')
    filename = 'report_{}.xlsx'.format(reportDate)
    filepath = Path.home() / 'Pythonic' / 'log' / filename
    wb.save(filepath)
    wb.close()

    recordDone = Record(filepath, 'Report saved under: {}'.format(filename))
    self.return_queue.put(recordDone)
The last part starts with the variable reportdata_dt: it holds an iterator which, when consumed, converts the raw Unix timestamps of the input data from the SQLite database (self.inputData) back into Python datetime objects. Next, I zip the reportdata_dt iterator with the previously created rows iterator and iterate until the first of them is exhausted, which should be reportdata_dt. During iteration, I fill the columns of each row with the timestamp and the value. In the last step, I save the spreadsheet with a filename consisting of the current date and time and forward the filename to the Telegram - 0x2e4148e2 element.
The Telegram - 0x2e4148e2 element then loads the file from disk back into memory and sends it to the user who requested the report.
The report the user receives is the spreadsheet template filled in with the logged timestamps and temperature values.
Wrap up
This article shows how to easily convert the Raspberry Pi into a data logger. The Raspberry Pi platform allows you to interact with sensors of any kind, enabling you to monitor physical values as well as computed values. Using spreadsheets as the basis for your reports gives you a lot of flexibility and makes those reports very customizable. The openpyxl library in combination with Pythonic makes it simple to automate this process.
Understanding the Digital World: My honest book review
I read a lot of books. I especially like to read books about computers, Linux, and the digital world we live in. I also enjoy reading books on the history of computing, about and by the people who helped make this digital world what it is today.
Imagine my excitement when I discovered the new second edition of an important book by Brian W. Kernighan, one of the leading figures in the creation of Unix, author or co-author of many influential books, and a professor of Computer Science at Princeton University. Understanding the Digital World combines computer history, technology, and personal story, along with discussions about how today's technology impacts our privacy.
Kernighan teaches a course at Princeton each year, "Computers in Our World," intended for computer users who are not Computer Science majors. He wrote this book to bring much of the information contained in that course to the world at large.
Kernighan starts with an exploration of the technology itself. The title of Chapter 1 is, "What is a Computer?" Covering the CPU and how it works, he describes various forms of storage, including RAM, cache, disk, and other types of secondary storage, and how they all work together. After this overview of the hardware, he describes algorithms, how they are used to solve problems, and how they get incorporated into computer programs. In later chapters, Kernighan discusses the internet, the TCP/IP protocols that drive it, and some of the tools used to communicate using the internet.
He looks at the data about ourselves (stored on our computers) that gets transmitted across the internet—with or without our permission. Although there are references to security throughout the book, Kernighan spends a great deal of these latter chapters discussing the many ways in which our data is vulnerable and ways to implement at least some level of protection.
The parts that scared me most were the discussions about how organizations can track our movements on the internet—the effects of this (and tools such as data mining) on our online experiences. I am familiar with using tools like firewalls and strategies such as using good passwords and deleting or deactivating programs and daemons that I am not using. But the ease with which we can get spied upon (there is no more accurate word for it) is appalling no matter what actions we may take.
My first inclination after reading this book was to send it to the two grandkids I am helping to build gaming computers. This book is a good way for them to learn how computers work at a level they can understand. They can also learn about the pitfalls (beyond those their parents have discussed with them) and how to stay safe on the internet. I also suggested to their parents that they read it, too.
It is not all gloom and doom. Far from it. Kernighan manages to scare me while simultaneously ensuring that readers understand how to mitigate the threats he discusses. In the vast majority of his scenarios, I had already implemented many of the protections he covers.
This book has made me think more closely about how I work and play on the internet, the methods I use to protect my home network, and how I use my portable devices. Kernighan's level of paranoia is sufficient to ensure that readers pay attention while reassuring us that we can still use the internet, our computers, and other devices with a reasonable amount of safety so long as we take the appropriate precautions.
No! I am not going to tell you all of that. You'll get no spoilers from me.
Kernighan indicates to readers the sections that may get too technical, and you can skip over them. Still, overall this is a pretty easy read and accessible even for many non-technical readers. This was intentional on the author's part. So even if your technology quotient is fairly low, this book is still understandable. Despite the fact that he wrote the first edition of this book only five years ago, this second edition includes important new material that makes it even more applicable to today's technology and the lightning-fast dissemination of data. I found the new section on artificial intelligence quite enlightening.
I highly recommend this book to anyone who wants to learn more about how computers work and impact privacy and security in the modern world.