Can AI Replace DevOps?

With the rapid advancements in artificial intelligence (AI) technology, many industries are starting to explore the potential benefits of incorporating AI into their operations. In the field of software development and IT operations, the question of whether AI can replace DevOps has been a topic of much debate.

On the one hand, AI has the potential to automate many of the tedious and time-consuming tasks that are typically carried out by DevOps teams. This could include tasks such as managing and monitoring infrastructure, deploying code, and testing and debugging applications. By automating these tasks, AI could potentially free up DevOps teams to focus on more strategic and creative work.

Additionally, AI can be used to optimize and improve the performance of software applications. For example, AI algorithms can analyze data and make predictions about potential failures or bottlenecks, allowing DevOps teams to proactively address these issues before they cause problems.

However, there are also limitations to what AI can currently do in the realm of DevOps. One of the biggest challenges is that AI systems require large amounts of data to learn and make accurate predictions. In the fast-paced world of software development, it can be difficult to collect and organize the necessary data in real-time. Additionally, AI systems can struggle with tasks that require flexibility and creativity, such as troubleshooting complex issues or coming up with innovative solutions to problems.

Overall, while AI has the potential to greatly enhance the work of DevOps teams, it is unlikely to completely replace the need for human expertise in the near future. Instead, the most effective approach will likely be to use AI to augment and support the work of DevOps teams, rather than trying to replace them entirely. By combining the strengths of both humans and AI, organizations can maximize the benefits of both and improve their overall performance.

Proof of History bash implementation

Proof of History (PoH) is a sequence of computations that, when checked, provides a cryptographic way to verify the passage of time between two events, using a hash.

In this example we are going to use MD5, but that could also be SHA-256 or any other hash mechanism. The idea is to have two functions: one that generates the next hash in the chain, and another that validates that a provided hash is the correct one.

gen-proof-of-history() {
    # $1 is the previous hash in the chain
    t=$(date +%s-%N)                                  # timestamp: seconds-nanoseconds
    h=$(echo -n "$t-$1" | md5sum | awk '{print $1}')  # hash of timestamp + previous hash
    echo "$t-$h"
}
check-proof-of-history() {
    # $1 is the timestamp, $2 is the prior hash
    h=$(echo -n "$1-$2" | md5sum | awk '{print $1}')
    echo "$1-$h"
}

The function gen-proof-of-history takes only one parameter, the previous hash. Its output is a record built from a timestamp and the hash of that timestamp combined with the previous hash.
The function check-proof-of-history takes two parameters: the first is the timestamp and the second is the prior hash. Its output is the record that the earlier gen call should have produced, so the two can be compared.
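A minimal end-to-end sketch of how the two functions fit together — generate one proof from a seed hash, then verify it (the "seed" value and variable names are illustrative):

```shell
#!/bin/bash
# Function bodies mirror the ones above.
gen-proof-of-history() {
    t=$(date +%s-%N)
    h=$(echo -n "$t-$1" | md5sum | awk '{print $1}')
    echo "$t-$h"
}
check-proof-of-history() {
    h=$(echo -n "$1-$2" | md5sum | awk '{print $1}')
    echo "$1-$h"
}

prev=$(echo -n "seed" | md5sum | awk '{print $1}')   # starting hash
rec=$(gen-proof-of-history "$prev")                  # record: "secs-nanos-hash"
ts=${rec%-*}                                         # strip the trailing hash
[ "$(check-proof-of-history "$ts" "$prev")" = "$rec" ] && echo "proof valid"
```

Recomputing the hash from the stored timestamp and the prior hash reproduces the record exactly, which is the whole verification.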

Browser load test

This simple load test will run 100 requests from your browser toward a targeted URL. The speed and rate of the test depend on the browser's ability to run these requests.

Please use with caution; we take no responsibility for any use of this tool.

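For comparison, a rough command-line sketch of the same idea using curl; the function name and the 100-request default are illustrative, not part of the browser tool:

```shell
#!/bin/bash
# Fire N concurrent requests at a URL and report how long the batch took.
load_test() {
    url=$1
    n=${2:-100}
    start=$(date +%s%N)
    for _ in $(seq "$n"); do
        curl -s -o /dev/null "$url" &   # each request runs in the background
    done
    wait                                # block until every request finishes
    end=$(date +%s%N)
    echo "$n requests in $(( (end - start) / 1000000 )) ms"
}

# usage: load_test "https://example.com" 100
```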


Regex performance test

Use this simple tool to test the performance of your regular expression. The test will run the regex against the given text and provide a number that tells you how fast it could run the regex 10,000 times.
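A comparable measurement can be done in the shell with Bash's built-in =~ operator; the function name and the 10,000-run default below are illustrative:

```shell
#!/bin/bash
# Time how long it takes to run a regex against a text a given number of times.
regex_bench() {
    regex=$1
    text=$2
    runs=${3:-10000}
    start=$(date +%s%N)
    for ((i = 0; i < runs; i++)); do
        [[ $text =~ $regex ]]             # match result is discarded; we only time it
    done
    end=$(date +%s%N)
    echo $(( (end - start) / 1000000 ))   # elapsed milliseconds
}

# usage: regex_bench '^[a-z]+$' 'sometext'
```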







Fix mixed content

We all know browsers notify clients of non-secure connections; they also alert when secure and non-secure objects are mixed on the same page. We will list here the simplest yet most effective ways to fix that issue, with and without modifying the pages.

Server side settings :

On the server side, you should respond with a header that tells the browser to send all requests over a secure connection, e.g. TLS:

Content-Security-Policy: upgrade-insecure-requests
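To confirm the header is actually being served, a quick check with curl can help (example.com stands in for your own site):

```shell
#!/bin/bash
# Fetch only the response headers (-I) and look for the CSP header.
check_csp() {
    curl -sI "$1" | grep -i '^content-security-policy'
}

# usage: check_csp https://example.com
```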

Web page setting :

You can also apply the same policy from within the HTML page itself by placing a META tag in the header of the pages:

<meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests">

docker-compose simple bash container

Docker-compose is a wonderful tool; I use it all day. I fire up containers, attach to them, and do stuff. This short manual will show how to run a container and connect to it. I am not using a Dockerfile in this example, but one may be used as well if needed.

What we need is a text editor to edit a simple file called docker-compose.yaml. This file will have all the (minimal) instructions needed to run a basic container orchestrated by docker-compose.

Here is the content of that docker-compose.yaml :

version: "3.7"
services:
  mydeb:
    container_name: mydeb
    image: "debian"
    restart: "no"
    entrypoint: /bin/bash
    tty: true

To start the instance, simply fire up the composer:

docker-compose up -d

The command will create the container, the network, and the other required resources, and start the container. If the image is not present, or whenever needed, you can also tell compose to pull it from a repository with docker-compose pull.

To connect to the running docker-compose container, run the exec command against the service name we defined in our docker-compose.yaml:

docker-compose exec mydeb bash

List TLS version and ciphers

With this small script you can get a list of all TLS versions and ciphers available when connecting to a remote destination. The challenge is that the number of supported ciphers can be large, and testing them all consumes time. The main tools used here are openssl along with parallel. I also added a timeout and a custom port that can be set at run time.

The script is built around two loops: one that loops over the TLS versions, and one that loops over the TLS ciphers of each version. The main command generates a file that is later executed using the parallel command. Feel free to copy and modify.
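The generate-a-command-file-then-run-it pattern, in isolation, looks like this (the job names are illustrative):

```shell
#!/bin/bash
# Write one command per line into a temp file, then run the lines concurrently.
RUN_F=$(mktemp)
for i in 1 2 3; do
    echo "echo job-$i" >> "$RUN_F"
done
if command -v parallel >/dev/null; then
    parallel --gnu -k -j 3 < "$RUN_F"   # -k keeps output in input order
else
    bash "$RUN_F"                       # sequential fallback if parallel is absent
fi
rm -f "$RUN_F"
```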

#!/bin/bash

TARGET=$1
TARGET_PORT=${2:-443}
TIMEOUT=${3:-2}
LOG="/tmp/TLS-$$.log"
RUN_F="/tmp/TLS-$$.sh"
TLS_V="tls1 tls1_1 tls1_2 tls1_3"

: >"$LOG"    # start with an empty log so sort works even if nothing connects

for V in $TLS_V
do
	# list the ciphers openssl knows for this protocol version
	TLS_CIPHERS=$(openssl ciphers -$V | tr ':' ' ')
	# TLS 1.3 suites are selected with -ciphersuites instead of -cipher
	CIPHER_COMMAND="cipher"
	[ "$V" = "tls1_3" ] && CIPHER_COMMAND="ciphersuites"
	for CIPHER in $TLS_CIPHERS
	do
		echo "echo | timeout $TIMEOUT openssl s_client -$V -$CIPHER_COMMAND $CIPHER -connect $TARGET:$TARGET_PORT &>/dev/null && echo \"$V $CIPHER\" >>$LOG" >>"$RUN_F"
	done
done

parallel --gnu -k -j 100 <"$RUN_F"
sort "$LOG"
rm -f "$RUN_F" "$LOG"

if statement one-liner bash

How to write a one-liner if statement in Bash. Although I think code should be clear and easy to follow by anyone who reads it, I also love simplicity and keeping my code as short as possible. An if statement, which usually tends to be long, can also be one line in Bash.

1. Simple do one command if true
 [ "$arg" -gt 10 ] && echo "$arg is bigger than 10"

2. Run multiple commands

[ "$arg" -lt 10 ] && { date; echo "$arg is smaller than 10"; }

3. If true run one command else run another (note: unlike a real if/else, the command after || also runs if the echo after && were to fail)

[ "$arg" -gt 10 ] && echo Big || echo Small

netcat is awesome

“Good things last long”, my mama used to say,
and just like that, netcat is no exception.
Using netcat for security testing is fun and simple, and you do not need to
install applications you know nothing about.
Here are some fun examples from tests I use:
Simple HTTP GET

echo -e "GET / HTTP/1.0\r\nHost:www.example.com\r\n\r\n" | nc 127.0.0.1 80

Simple HTTP flood

while true
do
    echo -e "GET / HTTP/1.0\r\nHost:www.example.com\r\n\r\n" | nc 127.0.0.1 80 &
done

Simple UDP flood

cat /dev/urandom | nc -u 127.0.0.1 53

Simple SYN flood

while true
do
    nc -z 127.0.0.1 80
done

bash singleton process

Here is a simple, yet effective, way to make sure a script runs only once.
When the script starts, it runs a single line that checks whether the number
of running copies of the script exceeds the expected count.

#!/bin/bash
# Exit if another copy is already running. Note two details: the $( ) subshell
# below inherits the script's process name, so a single instance counts as 2,
# and the numeric test must use -gt, not > (inside [[ ]], > compares strings).
[[ $(pgrep "${0##*/}" | wc -l) -gt 2 ]] && exit 1

This trick may come in handy, for example, when you run a backup from cron
and you do not want two processes running at the same time.
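An alternative sketch uses flock(1), which avoids the pitfalls of matching process names altogether (the lock-file path here is illustrative):

```shell
#!/bin/bash
# Hold an exclusive lock on a file for the lifetime of the script;
# a second copy fails to acquire the lock and exits immediately.
exec 9>/tmp/myscript.lock   # illustrative lock-file path, opened on fd 9
flock -n 9 || exit 1        # -n: do not wait if the lock is already taken
# ... rest of the script runs here; the lock is released when it exits ...
```

The kernel releases the lock when the process dies, so a crashed run never leaves a stale lock behind the way a PID file can.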