Simple way to secure your root shell just a little bit more

Well, I liked it. Say what you want but I like it :p

Simply edit root's shell startup file (/root/.bashrc, read by interactive shells) to include:

clear
echo "So you've found out the root password, now what?"
read -r P

if [ "$P" != "leeter" ]
then
 clear
 echo "I don't think so, goodbye!"
 exit
fi

clear

Or the one-liner:

clear && echo "So you've found out the root password, now what?" && read -r P; if [ "$P" != "leeter" ]; then clear && echo "I don't think so, goodbye!" && exit; fi; clear

Simply enter 'leeter' or whatever you specify when you get the prompt.

:)

Thing is, if you saw this prompt as you opened a root shell, you'd know there was a modified .bashrc. So you'd open .bashrc and see what the "password" is.

Perhaps it would work better if it didn't say anything along the lines of "So you've found the root password...", but simply left you at a command prompt. You might think you could enter any command, but actually, the shell will close unless you give the correct "password" first. Then again, after a couple of attempts, you might suspect that bashrc had been modified in some way....

Is there no way of obfuscating the password in bashrc?
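One option (a sketch, not from the thread) is to store only a hash of the passphrase, e.g. with sha256sum from GNU coreutils, so that merely reading .bashrc doesn't reveal the string. The digest is computed inline here purely so the example is self-contained; in real use you would paste in just the digest:

```shell
# Sketch: keep only a SHA-256 digest in .bashrc so a casual reader of
# the file cannot see the passphrase. 'leeter' is the hypothetical
# passphrase from the post above; hashing it here is for demonstration
# only -- in practice you'd store the precomputed digest string.
STORED_HASH=$(printf '%s' 'leeter' | sha256sum | awk '{print $1}')

check_pass() {
    # hash whatever was typed and compare it to the stored digest
    typed=$(printf '%s' "$1" | sha256sum | awk '{print $1}')
    [ "$typed" = "$STORED_HASH" ]
}
```

The prompt then becomes something like `read -r P; check_pass "$P" || exit`. Note this only hides the string from a glance at the file; it does nothing against the bypasses raised later in the thread.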


An encrypted volume won't do anything to prevent or stop an attack against a running system. It's not a bad idea, and it is better security-wise than the alternative, but it's more of a physical security consideration.

Regarding the original poster's idea, the reason it doesn't help security is that you're relying on two assumptions: that an attacker who knows your password has to execute (and can't read) your .bashrc file, and that your script can't be terminated before calling exit.

The first assumption is false. Nothing requires the shell to read or execute your .bashrc (or requires that bash run at all); an attacker could simply specify bash --norc as the program to run when using su or connecting over SSH, bypassing your extra password prompt entirely.
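To illustrate (a self-contained sketch, not from the post): bash only runs rc files for interactive shells, and --norc suppresses them entirely, so a guard placed in an rc file simply never executes:

```shell
# Create a throwaway rc file standing in for the booby-trapped .bashrc.
rcfile=$(mktemp)
echo 'echo LOCKED' > "$rcfile"

# An interactive shell pointed at the rc file runs the guard first:
with_rc=$(bash --rcfile "$rcfile" -i -c 'echo shell' 2>/dev/null)

# With --norc, bash skips rc files, so the guard never runs:
without_rc=$(bash --norc -i -c 'echo shell' 2>/dev/null)

rm -f "$rcfile"
```

The first capture contains the guard's output; the second is just the shell's own, showing the prompt in .bashrc was skipped.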

The second assumption is also false; as pyther said, a simple ctrl+c would terminate the execution of your script and present the attacker with a shell.
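For what it's worth, the script could ignore SIGINT while the guard runs using bash's trap builtin (a sketch; the --norc bypass still defeats it, since the rc file never runs at all):

```shell
# Ignore Ctrl+C (SIGINT) while the password guard is running.
trap '' INT

# Simulate the attacker pressing Ctrl+C:
kill -INT $$

# Still running: the signal was ignored rather than killing the shell.
survived=yes

# Restore default signal handling once the guard has finished.
trap - INT
```

The same trap line can also list QUIT and TSTP to cover Ctrl+\ and Ctrl+Z.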

Security is not a trivial thing to get right. Essentially what you have attempted is to add a second password, an extension to your existing password, if you will, and unfortunately it doesn't work very well. You'd be better off simply appending that extra string to the end of your actual password.
