The “Internet of Things” (IoT) is a rapidly growing topic of conversation, with major potential to change how we work and live.

Everything is connected around us. This revolution has already started and it will be bigger than previous technology revolutions, including the mobile smartphone revolution. Internet of Things, as many call it today, will fundamentally affect all of us.

— Ari Jaaksi – Mozilla Senior Vice President, Connected Devices

   In December of last year, Mozilla decided to shift focus to Connected Devices, and it has now officially confirmed that the arrival of Firefox OS 2.6 in May will mark the last release of Firefox OS supported on smartphones. Meanwhile, Boot to Gecko, the open source project from which Firefox OS was originally rebranded, continues to live on as an open source operating system.

 

It’s time for new challenges!

Mozilla will initially concentrate on four Internet of Things projects. These projects have a clear focus on user rights and privacy.

Link (also known as Foxlink)

A personal user agent that is under the full control of the user. Basically, it helps you interact with your connected devices and may automate certain tasks for you.

Project Link aims to be your own, personal user agent for the smart home, creating a web of things that is completely yours. Instead of entrusting your data to a third party, your Link agent understands your preferences for how you want to interact with the world of devices in your home, and can even automate your connected world for you. All of this still done conveniently and securely, but completely under your control.

SensorWeb

SensorWeb aims to advance Mozilla’s mission of promoting the open web as it extends into the physical world. It looks for the easiest path from sensors to open data, so that contributors can collaboratively use sensors to build a detailed understanding of their living environment.

Smart Home

Project Smart Home offers a middle ground between “in a box” solutions like Apple Homekit and DIY solutions like Raspberry Pi. Combining modular, affordable hardware with easy-to-use rules, Smart Home empowers people to solve unique everyday problems in new and creative ways.

Vaani

Mozilla wants to create an open voice interface that developers, device makers and end users can utilize.

Vaani aims to bring a voice to the Internet of Things (IoT) using open, Mozilla-backed technologies. We believe a voice interface is the most natural way to interact with connected devices, but currently, there are no open solutions available at scale. With Vaani, we plan to offer an “IoT enabler package” to developers, device makers, and end users who want to add a voice interface to their devices in a flexible and customizable way, while avoiding the need to “lock-in” with one of the major commercial players.

 

   Mozilla is looking for contributors for all these projects and says those interested can contact their connected devices participation team for more information.

The next version of Ubuntu is coming soon. I’ve successfully upgraded to Ubuntu 11.04, which is still in beta, so it is not meant for everyday use, just for testing and finding bugs. As is to be expected at this early stage of the release process, there are known issues that users may run into with the new, unreleased image. The thing is that after upgrading I didn’t see any menu at all; all I could see was a screen with the contents of the Desktop folder. So I did the following:

–> Right click on the desktop -> Create launcher

… enter a ‘name’ and ‘gnome-terminal’ for ‘command’

This way we have an icon that gives us access to a terminal…

–> I ran ‘gnome-panel’ from the CLI (it works fine alongside ‘unity’)
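For reference, the launcher created through the right-click dialog is just a small .desktop file; a minimal sketch (the file name and the Name value are arbitrary) looks like this:

```
[Desktop Entry]
Type=Application
Name=Terminal
Exec=gnome-terminal
```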

If you do not have the Ubuntu desktop installed (for some reason), you can install it using:

[pensacola@pensacola-tech ~]# sudo apt-get install unity

or

[pensacola@pensacola-tech ~]# sudo apt-get install ubuntu-desktop

After that I was able to access main menu, applications and so on…

Besides the graphics and display issues, there are also some networking and WiFi problems: the wireless connection fails when switching to a virtual console (bug 656757).

The stable release comes out at the end of this month! :-)

27. March 2011 · 1 comment · Categories: Linux

Apt-get is a great command line tool for installing apps in Linux, but lately I’ve tried apt-fast. Apt-fast is a simple command line utility that can make installing and upgrading software on Ubuntu/Debian much faster.

Apt-Fast uses the Axel download accelerator to download different pieces of a package simultaneously, lowering the total time it takes to download a package (this is all about HTTP/FTP downloads). To set it up, download the shell script, put it in your home folder, and run the following commands:

[pensacola@pensacola-tech ~]# sudo apt-get install axel

[pensacola@pensacola-tech ~]# sudo mv apt-fast.sh /usr/bin/apt-fast

[pensacola@pensacola-tech ~]# sudo chmod +x /usr/bin/apt-fast

You can now install packages from the terminal using the apt-fast command.

E.g.: [pensacola@pensacola-tech ~]# apt-fast install gftp

Note that this doesn’t speed up your actual internet connection, just the download of packages from the repositories!


Cloud computing provides a way to develop applications in a virtual environment, where computing capacity, bandwidth, storage, security and reliability aren’t issues (you don’t need to install the software on your own system).

Open source is important in all aspects of cloud computing. It is used to build the core of the “cloud” and its services. Linux is the operating system of choice for both physical and virtual machines in the cloud. This may not be the year of the Linux desktop, but it’s definitely the year of Linux powering cloud computing!!!

Cloud computing consists of computing, networking, and storage resources used to power services, as well as combinations or mashups of services, that previously had been expensive, impractical and even impossible to provide.

There are three types of cloud computing options available today:

  • Virtual infrastructure provisioning
  • Application development and delivery
  • Building your own cloud from scratch, using your own storage, processing, and networking resources

Infrastructure provisioning is the most flexible option because it provides raw computing resources such as CPU, bandwidth and storage. A good example of such a service is Amazon’s Elastic Compute Cloud (EC2). The user has complete control of these resources and what they do with them. More info here: http://cloudcomputing.info/en/news/2010/07/amazon-allows-vms-custom-linux-kernel-on-ec2.html

23. November 2010 · 1 comment · Categories: Linux

SSH on multiple servers using Cluster SSH

If you have ever had to make the same change on more than one Linux/Unix server (a backup, a restore, or something else), ClusterSSH is a tool that solves this problem. It’s painful to keep repeating the exact same commands again and again, so Cluster SSH opens terminal windows with connections to the specified hosts, plus an administration console. Any text typed into the administration console is replicated to all other connected and active windows. The tool is designed for cluster administration, where the same configuration or commands must be run on each node within the cluster; this way all nodes are kept in sync.

It’s easy to install it … on Ubuntu you just have to type:

[pensacola@pensacola-tech ~]# apt-get install clusterssh

To configure it, edit the /etc/clusters file. It contains a list of tags and hostnames, in the form:

<tag> [<username>@]hostname [...]

Basically, if you want to run the same command on three servers (server1, server2, and server3), you use the following command:

[pensacola@pensacola-tech ~]# cssh server1 server2 server3

This will open three consoles, one for each server, over an SSH connection, plus one small console where you type your commands.
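To use the tag form instead, a hypothetical /etc/clusters entry (the tag web, the user name, and the host names here are made up) could look like:

```
web admin@server1 admin@server2 server3
```

After that, running cssh web opens connections to all three hosts at once.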


20. August 2010 · 9 comments · Categories: Linux
"Unable to connect to database: Too many connections" - common issue ...

Cause:
This error means that the limit of simultaneous connections to the MySQL server has been
reached and new connections cannot be established at this time.

Resolution:
There are two ways to solve this issue. The first is to increase the connection
limit; the second is to find the cause of the “too many connections” error and
try to lower MySQL server usage.

The MySQL server state can be checked using the ‘mysqladmin’ utility. For example, to find
the number of current connections to the server, use:

#mysqladmin -uadmin -p extended-status | grep Max_used_connections
| Max_used_connections | 11 |

Current connections limit settings can be found with:

#mysqladmin -uadmin -p variables | grep 'max.*connections'
| max_connections | 100 |
| max_user_connections | 0

In the example above, the maximum number of connections to the server (max_connections)
is set to 100, and the maximum number of connections per user (max_user_connections)
to zero, which means unlimited.
These are the default MySQL values. They can be redefined in /etc/my.cnf, for example:

[mysqld]
set-variable=max_connections=150
set-variable=max_user_connections=20

Restart MySQL after my.cnf is modified.

Note that if you set the connection limit to a very high value (more than 300), it may
affect server performance. It is better to find the reason for the high
MySQL server usage.
You can check which users/requests slow MySQL down and list all current connections,
for example with the command:

# mysqladmin -uadmin -p processlist
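The processlist output can be post-processed with standard tools. As a sketch, counting connections per user looks like this (the sample lines below are fabricated, and it is assumed that field 3 of mysqladmin’s pipe-separated table is the user):

```shell
# Fabricated processlist lines for illustration; with '|' as the field
# separator, field 3 of each row is the connecting user
sample='| 11 | webapp | localhost | shop | Sleep | 120 | | |
| 12 | webapp | localhost | shop | Query | 3 | | SELECT 1 |
| 13 | backup | localhost | shop | Query | 45 | | SELECT 2 |'

# Count connections per user, busiest user first
echo "$sample" | awk -F'|' '{gsub(/ /, "", $3); print $3}' | sort | uniq -c | sort -rn
```

In real use you would pipe `mysqladmin -uadmin -p processlist` into the same awk/sort/uniq chain.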
In order to restrict FTP access by IP, we have to create a file named .ftpaccess with the
following content:

 <Limit ALL>
 DenyAll
 Allow 127.0.0.1
 Allow IP(s)
 </Limit>

(If you have a dynamic IP, you have to allow an IP class inside the <Limit> block instead, like this:

 Allow 1.2.3.
 Allow 1.2.4.
 Allow 1.2.5.
 Allow 1.2.6.
 Allow 1.2.7. )

This file must be uploaded to the httpdocs, httpsdocs, cgi and web users directories ...
That's it!

Here is a good website to check if your IP is on the blacklist …

09. July 2010 · 2 comments · Categories: Linux

A quick and useful command for checking if a server is under ddos:


netstat -anp | grep 'tcp\|udp' | awk '{print $5}' | cut -d: -f1 | sort | uniq -c | sort -n


That will list the IPs with the most connections to the server. It is important to remember that DDoS attacks are becoming more sophisticated, using fewer connections from more attacking IPs. In that case you will still see a low number of connections per IP even while you are under a DDoS.
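To see what the pipeline does, here is a sketch run against fabricated netstat-style lines (the addresses are examples; it is assumed that column 5 of the output is the foreign address):

```shell
# Fabricated netstat -anp style lines for illustration
sample='tcp 0 0 10.0.0.1:80 203.0.113.5:4312 ESTABLISHED 123/apache2
tcp 0 0 10.0.0.1:80 203.0.113.5:4313 ESTABLISHED 123/apache2
tcp 0 0 10.0.0.1:80 198.51.100.7:6001 ESTABLISHED 123/apache2'

# Same pipeline as above: extract the foreign IP, count occurrences,
# and sort ascending so the busiest IP ends up on the last line
echo "$sample" | awk '{print $5}' | cut -d: -f1 | sort | uniq -c | sort -n
```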

Another very important thing to look at is how many active connections your server is currently processing.


netstat -n | grep :80 | wc -l

netstat -n | grep :80 | grep SYN | wc -l


The first command shows the number of active connections that are open to your server. Many of the attacks typically seen work by starting a connection to the server and then never replying, making the server wait for the connection to time out. The number of active connections from the first command will vary widely, but if you are much above 500 you are probably having problems. If the second command reports over 100, you are likely dealing with a SYN attack.
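As a quick sanity check of those two counters, here they are run against fabricated netstat output (the states and addresses are invented for the example):

```shell
# Fabricated netstat -n style lines for illustration
sample='tcp 0 0 10.0.0.1:80 203.0.113.5:4312 SYN_RECV
tcp 0 0 10.0.0.1:80 203.0.113.9:5120 ESTABLISHED
tcp 0 0 10.0.0.1:22 198.51.100.7:6001 ESTABLISHED'

echo "$sample" | grep :80 | wc -l            # all connections involving port 80
echo "$sample" | grep :80 | grep SYN | wc -l # only half-open (SYN) connections
```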

To block a certain IP address on the server, use the following commands:


route add -host ipaddress reject


… for example: route add -host 192.168.0.168 reject

You can check whether a given IP is blocked on the server using the following command:


route -n | grep IPaddress


Or use the following commands to block an IP with iptables on the server:


iptables -I INPUT 1 -s IPADDRESS -j DROP

(use -j REJECT instead of DROP if you want the sender to be notified)

service iptables save

service iptables restart


Then kill all httpd connections and restart the httpd service using the following commands:


killall -KILL httpd

service httpd startssl


A simple command such as netstat -n -p | grep SYN_REC | wc -l will count all the active SYN_REC connections on the server; depending on the server’s size, 30 to 40 SYN_REC entries could be a sign of a DDoS attack.

Again, do not fixate on the numbers; different variables come into play when deciding whether to ring the DDoS emergency bell.


netstat -n -p | grep SYN_REC | awk '{print $5}' | awk -F: '{print $1}' will therefore list all the IPs that are maintaining SYN_REC connections.

