Sep 20

The Importance of Writing

I have been really bad about writing this year. If you look back, my last post was in January, and we are already in September. It has definitely been a busy year for me, and the busyness has been distracting. Recently I read an article that I thought was important enough to both share and act on.

The article is about writing. You see, writing provides many valuable benefits; for me the main ones are competency and self-examination. This blog may seem like a random collection of my thoughts on computer architecture, theory, and helpful tips, but for me it's an examination of my personal growth.

One truly understands something if they can translate it into words and have another person read it and grow from it. It's the same concept as learning through teaching, but cranked up a notch. With this in mind, I will start brainstorming some cool topics about some cool technology, and publish cool articles that cool people read!

But before I end this post, I would like to share what got me back into writing. The article is called "Everyone Should Write", and it briefly and effectively tells you how to get started.

Everyone Should Write

So with that… I'm back, baby!


Jan 05

Cross Site Scripting

Recently a few of my friends became white hat hackers. This is an interesting predicament for someone like me: I am both a systems designer and a developer, and my very good friends had more or less become a friendly version of the enemy. Over a couple of beers they recounted tales of how insecure the world is, and how easy that is to prevent. Well, I started to think… I develop different websites, and I am developing the highly anticipated Query Glass. I need to make sure everything is secure if I want people to continue to trust me and anything I create.

I set off to learn some hacking in order to better protect against it. I will readily admit I am not a top hacker by any means, but if I can hack something, it will be easy for any hacker to get in. The first step is to understand the attack and know how to exploit it yourself. Lately, cross site scripting has been all the rage, so I decided to tackle this exploit first.

The Hack

Cross site scripting (XSS) is surprisingly simple to do. It is scary easy. The idea is to trick the server into serving JavaScript that the victim's browser will execute. To do this, the hacker goes onto a website and types something like the following into a typical text field.

" <script> alert('Hacked'); </script> "

It should look something like the below. This hack should work in any unsecured textbox.

Cross Site 1

When the hacker presses enter, the browser sends this to the server. Now assume you typed it into a search box. On just about any site, the search string you pass is written back onto the page so the user can see what their last search was, which sounds quite reasonable if you ask me. The secret is the quotes. As anyone who has done HTML development knows, misplaced quotes can wreak havoc on the way code is rendered in a browser. The extra quotes break out of the surrounding markup, so the text in between is printed onto the final HTML document as live code. Since the script tag is no longer encapsulated in a quoted value, it executes on page load.

Now, the code above is quite benign; the only thing that happens is that an alert box opens saying "Hacked". But making an alert box appear means you can execute arbitrary code, which is very bad. If you replace 'Hacked' with document.cookie, the alert box will contain all the cookie data, which can include all kinds of personal user information.
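To make the mechanism concrete, here is a minimal sketch of the kind of server-side snippet that makes this possible. This is hypothetical code for illustration, not taken from any real site; the function name is made up.

```php
<?php
// Hypothetical vulnerable snippet: the search query is pasted
// straight into the page's HTML with no validation or escaping.
function renderSearch($query) {
    return "<p>You searched for: " . $query . "</p>";
}

// A benign query renders as plain text...
echo renderSearch("kittens") . "\n";
// ...but a crafted query injects live markup that the victim's
// browser will execute when the page loads.
echo renderSearch("<script>alert('Hacked');</script>") . "\n";
```

Because the second call returns real `<script>` markup inside the page body, the browser has no way to know it came from user input rather than the site's author.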

Another devious use of this exploit is to put a redirect (for example, setting window.location) between the script tags to send the user to another site. Imagine a hacker sends an email out to customers of a certain bank with a link to:


They make up a story that a security breach has occurred and a mandatory reset of passwords is taking place, and they provide the above link as a quick and easy way to reach the site and reset the password. Now, a user slightly more aware than the typical consumer will mouse over the link to ensure that it goes where it says it's going. The domain looks legitimate. It is certainly the bank's domain, and there is gibberish at the end of most internet URLs anyway. So they go ahead and click on it.

The user will in fact go to the bank's website, but the redirect hidden in the URL gets written onto the page. The website executes the hidden code and sends the user to a phishing site made to look exactly like the bank's website. The user innocently resets their password, but in reality has handed it over to the crooks. Ladies and gentlemen, your poor respect for data validation has left a poor old couple without their retirement funds…

Cross Site 2

Preventing the Attack

Now that we know how to conduct the attack, we must stop it in its tracks. There are a few ways of accomplishing this. One is to filter each URL that goes through the web server and blacklist text that looks like a cross site scripting attack. This method of prevention is highly dependent on your web server, so I recommend you refer to your web server's documentation to figure out how to set it up. My preferred method is basic data validation: I normally don't let my users use any special characters. That simple. I only allow letters, numbers, and whitespace.

Special characters are quite important in the coding world, so restricting them causes tons of problems for hackers. To filter out the bad characters, I run every input field through the function below.

function valString($str) {
    if (!preg_match("/^[a-zA-Z0-9 ]*$/", $str)) {
        return false;
    } else {
        return true;
    }
}

If the input is clean, I allow the page to load. If the function finds any issues, I return users to the page where they entered their input and wag my finger at them, or whatever else I want to do. One thing is for sure: I don't allow the page to load with the injected code.
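Whitelisting input works well for fields that never need punctuation. When you do have to display user text that may legitimately contain special characters, the standard complementary defense is output encoding. Here is a minimal sketch using PHP's built-in htmlspecialchars(); the wrapper function name is my own invention.

```php
<?php
// Escape user input before writing it into HTML, so any markup
// becomes inert text instead of executable code.
function safeEcho($userInput) {
    return htmlspecialchars($userInput, ENT_QUOTES, 'UTF-8');
}

// The angle brackets and quotes are converted to HTML entities,
// so the browser displays the tag literally rather than running it.
echo safeEcho("<script>alert('Hacked');</script>");
```

With this approach the search box from earlier can still echo the user's query back, because the echoed text can no longer break out into live markup.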

So that is it. Keep coding and keep it secure.


Status update

I want to wish you a happy Thanksgiving from Tech Daemon! Enjoy your turkey!


Nov 09

New Top Level Domains

It is here. It is finally here! New top level domains! What does this mean, you might ask? It means that you are now going to see websites on entirely new domain endings. I am curious to see how this will affect the online landscape. This is not the first time top level domains have been added; think back to when country specific domains were introduced, like the one I use for this site.

The .me domain is actually based out of Montenegro and was added to the domain structure back in 2007. Everyone thought it was going to be a hit. It was priced competitively, and a slew of marketing campaigns was launched trying to convince consumers that the .me domain was going to make the internet personal.

Well, it worked. The .me domain is now very commonly used for blogs and personal sites like this one. The appearance of other top level domains will open up new opportunities, but with so many of them, I predict that many will be failures. The reason is that .me resonates with what it is: it is quite literally me. Domains like .day and .fire will probably be overlooked. This isn't to say there aren't great ones; .app, for example, will probably sell through the roof.

Another thing to note is that these new domains, since they are fresh, are being sold at a premium. The .luxury domain, for example, is being sold for $700. More importantly, this will destabilize the domain market. Most people don't realize that domains are bought and sold as a commodity; people make a living finding attractive domain names, buying low, and selling high. These releases will make the supply of names grow astronomically, and the effects on the market are yet to be seen. Between a price floor like the price of .luxury and the increase in supply, it is hard to tell where the value of domains will settle once the dust clears. I believe that people are cheap, so the price of domains will probably stay close to the floor. I also think that some domains which are currently valued very highly will lose value, since decent alternatives will now exist.

Right now only .luxury, .menu, .build, and .uno are available for presale, but there are hundreds more that will go on sale soon. If you really want to get in on this, I normally use GoDaddy for all my domain name registration. They are fast and easy, and if you search for coupon codes, super cheap. You should never have to pay full price on their site.

So happy coding, and happy registering!



Hello, followers. I have found yet another boon of information. The link below has free ebooks on just about every topic in programming you can think of. Unlike my previous ebook links, these cover all kinds of topics, not just Microsoft. My sharp-eyed readers will notice that the Microsoft collection I have published before made the list. Seriously, an unbelievable amount of material that can help you on your next project.


Sep 22

Colosimo Photography is Up and Running

My friend John Colosimo will be featured on the Parana Observatory's picture of the day for his astrophotography. He is quite the talented photographer, and if you have never seen astrophotography, it is quite amazing; the planning and forethought that goes into each of those pictures is impressive. I am happy I could help him share his talent with the world by assisting him in building his website, and I am proud to say it is up and operational. I highly recommend that everyone take a look at these amazing pictures.


Status update

I have decided to do something about universal data. I have created a website called Query Glass. QG will start off as a simple database project, but there will also be a data aggregation server design thrown into the mix. That is the special sauce, so to speak.

So keep an eye on development. I will include a link in the projects tab on this site to make it easy to find.



Aug 11

A Shift for Universal Data

In today's technological climate it's a shame to see how decentralized we are. Every group is trying to take a new spin on things but accomplishes very little. Granted, this is probably the most connected society the universe has known, but it could be so much more. Even with all these connections, the information technology world is fragmented. How many social sites must one keep up with? How many email accounts do you have, with how many different services?

Society needs to start looking at communal data that can easily be used for data acquisition. The challenge is scale: it is difficult for one company to make centralized communal data available to the public. The US government has created a federal database full of great data sets, and there are also many fantastic privately run databases, such as The Movie Database. I would love to see more consistency in the way the data is acquired, but that would be a monumental effort. Maybe over time we will see these efforts come together.

The data in the Internet (the cloud before marketing got their grubby hands on it) is already there. The first problem is that the way we access that data is fragmented: for social data we use Facebook, for the weather we go to a weather site, and so on. The second problem is that our ability to collect data about our own lives is still in its infancy. The third problem is that there is no real way to combine all that data and make smart adjustments to our lives.

That is why I propose a shift in the American way of life, where home infrastructure becomes the norm. And when I say infrastructure, I mean that an old computer hooked up in the corner will almost certainly work. The idea is that this computer acts as a server that is a collection point for data in the home. Essentially, I am proposing a home SCADA system. With small endpoints collecting data around the household, that data can be cross referenced with public stores of data. Then your home system can automate choices for you, such as setting the thermostat to an optimal temperature or deciding a more optimal watering schedule for your lawn. The nice thing about pulling data in from the cloud to be cross referenced locally is that it ensures privacy: there would be no company trying to sell you anything extra, or selling you as the product. This would both enhance your life and keep it private at the same time.
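As a toy illustration of the kind of decision such a home server could automate, here is a sketch in PHP. The function name, thresholds, and setpoints are all made up for the example; a real system would pull the forecast from a public data source and the indoor reading from a household sensor.

```php
<?php
// Toy sketch: cross-reference a local sensor reading with a
// public forecast to pick a thermostat setpoint (Fahrenheit).
// The thresholds and the pre-cool setpoint are arbitrary.
function chooseSetpoint($indoorTempF, $forecastHighF) {
    // Pre-cool only when a hot day is forecast and the house
    // is already warm; otherwise hold a standard setpoint.
    if ($forecastHighF >= 90 && $indoorTempF >= 75) {
        return 72;
    }
    return 76;
}

echo chooseSetpoint(78, 95); // hot day forecast, warm house: pre-cool
```

The point is not the rule itself but that the decision happens locally, on data the household owns, with the public forecast pulled in rather than personal data pushed out.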

Universal Data

To accomplish this, there needs to be a paradigm shift. That, and the home infrastructure needs to be accessible: easy to use, easy to set up. So here I go, off to try and accomplish this feat. Will I succeed? Perhaps. At least I'll have fun trying.




Everyone loves free. Adobe recently released its entire CS2 line of products for free. This 2005 suite of products was hands down best in class for its era. The tools are now seven years old, but they are still quite powerful by today's standards.

Happy publishing!!


Dec 26

Home Theatre System: The Design

I recently stumbled upon a new hobby that has been quite an adventure for me, one that requires both technical skill and creativity to solve problems. You see, I have a fascination with system integration, and I enjoy building systems where you cannot tell there is any integration at all. I hope to write a series of articles describing this adventure, creating a path that someone else could follow along the way.

The Requirements

The first step was to jot down the requirements. I had to think about what I wanted to accomplish with this system and what kind of functionality I wanted to get out of it. Then I looked at what resources I had lying around the house. I have three TVs: one each in the bedroom, the living room, and the kitchen. I knew that in the living room I wanted the full experience: live TV, PVR, streamed internet TV, Blu-ray, and some gaming. It also has to support HDMI connections and be ready for 7.1 surround sound. For the kitchen and bedroom I would be happy with just live TV and streaming.

I also wanted to make sure everything looked cool and was consistent across the different TVs, and that all the systems worked together seamlessly, so that when I change programs I don't need to change input devices. Finally, I wanted to reuse as much as I could. I have an old computer I built back in middle school, which has dated hardware, and a couple of laptops. So I figured I would need to pull a lot of different elements from different sources.


Requirements of the System

  • Live TV
  • Personal video recording
  • Ability to stream from Amazon Instant Video, Hulu, and Netflix
  • Playing local files (movies, music, images, etc…)
  • Ability to play Blu-ray
  • Gaming
  • Visually appealing
  • Synchronization across many TVs and sound systems
  • Needs to be intuitive with multiple input devices
  • The system should minimize the need to change input devices
  • Gaming should ideally be done with a controller
  • Has to be high-definition and surround sound capable

The Design

Now that I knew what I wanted and what I had, it was time to figure out how to run all of this. In my early experiments I used the laptops to get a feel for the software. They were quite convenient, since they are portable and have built-in wireless, but they wouldn't accomplish what I had in mind. I decided I would have to build my own HTPC for the living room. Since this is not a full-on gaming rig, and I am only planning on playing games that work well with a controller, it only needs to be a solid mid-range build. I have also chosen Windows 7 as the operating system, a decision influenced by my desire for gaming and easy Blu-ray playback.

For the kitchen and bedroom I need very little hardware and a low profile. Recently, the Raspberry Pi was released: a small computer that costs approximately $50-$60 once all the additions are purchased, and is quite capable of running the software and playing the videos that I want in those rooms. On these systems I am planning to run the Raspberry Pi flavor of XBMC on Linux. It is cheap, easy to manage remotely, and keeps the cost down on these slim-featured nodes. It fits the requirements and keeps costs low.

Now there is one final component: a server that synchronizes everything. The server's duty is to combine all the sources of media, present them to the endpoint nodes, and keep all the bookkeeping information. This server will be running constantly, so it needs to be designed with power consumption in mind. The operating system I have in mind is again Linux, because it plays friendly with all the different operating systems involved, keeps the overall cost down, and is easy to manage over a secure shell connection.

In terms of software I like to use XBMC. It is quite versatile: it can play all kinds of file-based and streamed movies and TV shows, the newest release can handle live TV and PVR, and it runs on just about any operating system and platform. There are others, but I like XBMC's degree of control and how platform-agnostic it is.


So this design centralizes everything in a server, then gives you two flavors of hardware that let you trade cost against functionality depending on each node's role. Future articles will go into further detail about each component and how I built it.

