Tuesday, May 18, 2010

The Effects of Facebook and Other Social Networking on Communication Skills

     As technology has continued to improve at an exponential rate, the internet has become a tool that no one can go without, and time spent online keeps climbing. Much of this increase can be attributed to social networking sites such as MySpace and Facebook. These sites have become so heavily trafficked that they have had a profound impact on the communication skills of our nation's youth. Teens who use these websites extensively lose the ability to communicate at the same level as teens of past generations.
     It was only four years ago that Facebook finally opened its doors to everyone who was in, or had passed, high school, where previously there had been a stricter age requirement. With this change, the status symbol for pre-teens graduating from middle school and entering high school became their Facebook accounts. Four short years after completely opening up its system, Facebook has become the second most visited website, behind only Google. After "an increase of a whopping 82% over [the last year]" (Scott), Facebook users in the US now log over 1 billion hours on the site per year. Such a dramatic increase is not solely a product of kids taking time away from homework; the use of Facebook now interferes with time that could be spent in face-to-face communication.
     Many proponents of online social networking websites suggest that because of the increased time spent on these websites, and subsequently the increased amount of communication, the community has improved the overall skill and quality of its communication. Citing the line "practice makes perfect," they claim that this increased communication gives teens more time to develop their skills, making them better communicators. This is a blatantly incorrect analysis. Though the quote is a fitting analogy for the situation at hand, the group of extensive users is in no way increasing its communication skills. The quality of conversation on these websites is so diminished that the extra time spent is in fact detracting from these kids' skills. It is parallel to a soccer player sitting on a park bench, kicking a ball between his feet. Sure, he is practicing soccer, in the sense that he is moving the ball with his feet. However, no sensible person would argue that he is better off continuing his current drill rather than playing in a game to improve his skills.
     Few would dispute the logic of that analogy. The real confrontation lies in the use of "diminished" in the statement above. Is the communication going on online really of a lesser quality, and does it require a lesser skill set than offline communication? This is what is at the heart of the debate.
     In defining communication, there are a few factors that need consideration. First, one needs to analyze body language: how a person carries themselves during conversation. Second is the use of correct grammar and proper diction: whether the speaker chooses words that correctly convey the intended meaning, and whether grammatical rules are followed. The third factor is the speed of the conversation. In a discussion, participants have at most a second or so to respond to what was previously said with a well-thought-out reply. The final facet of good communication, and arguably one of the most difficult to master, is the ability to carry a conversation and to cultivate meaningful topics of discussion. These are the elements that determine the quality of communication, and as teenagers spend more of their time behind a computer screen instead of face to face with their peers, all of these skills are negatively impacted.
     When talking in person, all participants must show proper body language. This includes looking others in the eyes, sitting up straight, not getting distracted, and maintaining positive and engaged facial expressions. These are the types of things that parents reprimand children for when they are young, and it is during the teenage years that these skills are solidified. This is a clear-cut example of how Facebook negatively affects teenagers. Facebook's wall posts and IM client do not require any sort of visual interaction between participants. This gives everyone free rein to look and act however they please, and never to practice proper body language. As a result, kids now are much less adept at conversational mannerisms, which parents, who were taught these mannerisms as children, read as a sign of disrespect rather than a lack of knowledge on the part of the teens. This has opened a social gap between those who had access to these new technologies as teenagers and those who did not.
     People who disagree with this analysis argue that although it is true that teens are less adept at face-to-face conversational mannerisms, it no longer matters. As life moves into the twenty-first century, the explosion of technology and 24/7 connectivity has given everyone more time to communicate from behind the screens of their devices than would ever be possible with in-person interaction alone. Now that the necessity for face-to-face interaction has been removed, they argue, there is no reason why body language should remain an intrinsic value in our society.
     Although it is true that these devices have removed much of the necessity for in-person interaction, this is no reason to disregard it entirely, or to ignore body language in the meetings that remain. Humans need social interaction to survive, and many would go mad without it; a completely online lifestyle does not fulfill this need for most people. Another point against the opposing side's statement is that a great deal of face-to-face interaction still goes on daily that could not currently be replaced with technological devices. Therefore we cannot replace or disregard the value of body language in our culture.
     A seemingly less pressing issue than the previous example, diction and grammar have always been a shortcoming of the teenage population. Still, there has been an undoubted decrease in correct word choice and language usage. Most people type less than half as fast as they can talk, and now that many teens do the bulk of their communication online, one might wonder how they make up for the lost speed of conversation. Kids of this generation have made up the time by cutting corners on the grammatical correctness of their conversations, devising ever more creative ways of shortening words, phrases, and sentences. With this ever-growing dictionary of abbreviations, the teenage population is slowly degrading communication all throughout America. Not only has the teenage community ostracized itself by coming up with such slang, but within the general teenage population there are smaller groups and cliques that use an even more specific set of lingo dedicated to their group. This is yet another case where "practice makes perfect" comes into play, but here, sadly, the result is anything but perfect. These teens, whose original intent was to shorten the amount of typing necessary so as to encourage speedier online conversations, ended up using their own terminology so much that it has become ingrained in their brains and now comes out repeatedly in face-to-face conversations. It is this usage in daily in-person discussion that proves the negative effect Facebook and similar websites have on grammar and diction.
     On the opposite side of the argument is the idea that this radical change in diction and grammar should not be seen as disrespect for the English language, but rather as a step in the evolution of our language into a newer form of communication. These people liken it to looking back on dialogues from the 1600s and seeing how much communication has changed since. Language has always evolved; it is part of the natural way of life, and those making this argument hold that this is only another small step toward yet another drastic change in modern language.
     I will concede that one can relate this change in modern language to the shift from Shakespearean-era English to the present-day language. However, there is a large difference between these otherwise parallel comparisons. The vast changes from the 1600s took place over a span of four hundred years, while the current language revolution has come upon us in less than two decades! Evolution, whether of language or of living organisms, has been shown to take place over vast periods of time, greater than any human's lifespan. So the claim that this change in language is another step in evolution is pure fallacy. These changes in teenage communication are no more the product of language evolution than they are of nuclear radiation to the brains of teens.
     In an interview, during his dissection of online social networking's effects on present-day communication skills in teenagers, Mark Wang stated that these websites were causing a great "lack of face to face conversation," which was putting these teens at an unfair disadvantage against previous generations. He specifically cited that this lack of real-world communication led to an "unrealistic amount of time to think for a conversation." This is a very valid observation. Online, teenagers are not expected to reply quickly. Many will take great deals of time, crafting their responses ever so carefully, to send just the meaning they want. This buffer time, available to revise and edit dialogue before it is sent, does not exist in face-to-face conversation. Once you say a word, you cannot magically erase it from being heard, and likewise you do not have an open-ended amount of time to think of the best way to phrase something. This stark change from online to offline communication has slowly worn away at teenagers' ability to respond in a timely manner in real-life conversations; instead, they take far longer to produce a fairly simple response that should have come out on impulse.
     Opponents of this idea argue that it is not a valid point because conversations online are of a completely different nature than those offline. Conversations online largely revolve around gossip and similar topics, while offline conversations are much more in depth.
     That is a separate topic of discussion in itself; the claim, however, is still false. Habits are formed by repetitive actions, and it is undeniable that these teenagers repeatedly take their time responding to their friends' and families' electronic messages. Whether or not the type of communication is different, the brain is associating conversation with this atypically large window in which to think of an answer, and it is this association that carries over into face-to-face conversation. Another flaw in the counter-argument is that if offline conversations were mostly more in depth, and online conversations were just about gossip, then it should theoretically be easier to take part in a conversation online than offline. Teens should then have no problem keeping up with offline conversations; however, this is not the case.
     On the separate issue of what is being discussed, there is the question of what the vast majority of online conversations revolve around, and whether that has impacted teenagers' ability to come up with meaningful topics of conversation. Most of the communication that goes through social networking websites such as Facebook and MySpace can be generalized as forms of gossip: the spreading of information about who did what in school today, and other such topics. In the writing community, as in conversation, this is not looked upon as meaningful material. Topics that are given merit in conversation and writing are new advancements in technology, world politics, and other more cultured areas of discussion. Now that teens spend such a great deal of time online, they are getting less and less exposure to these meaningful topics, which translates to an inability to sustain a conversation about anything other than gossip in real-life situations. Teenagers are finding it progressively more difficult to have full conversations with peers and adults alike that do not touch on gossip.
     A completely separate facet of this debate from the act of communication itself is the idea that multitasking is encouraged by social networking websites and has a proven negative effect on one's ability to perform seemingly simple tasks at the same level of quality as when not multitasking. Facebook has one of the most heavily used internet messaging systems in the world. Whenever you are signed on to Facebook, you can immediately see which of your friends are online, and those that are online, you can chat with in real time. This chat system differs from real life by allowing you to open as many concurrent conversations as you want, all while still browsing the rest of Facebook. The ability to run all of these tasks at once from one website has led to intense multitasking in the everyday lives of teenagers. The idea that multitasking hurts your ability to accomplish tasks is not a revolutionary concept; it has been around for decades. However, this new online revolution is pushing the envelope to never-before-seen limits. Kids now try to focus on three conversations, look at pictures of their friends, and watch TV, all while trying to complete their homework assignments. These new levels of multitasking encouraged by online social platforms have had detrimental effects on teenagers' ability to focus efficiently on one task, and to complete any task at as high a quality as if they had not been multitasking.
     Those on the opposing side of this argument do concede that multitasking is encouraged by these social platforms, and that multitasking has proven detriments. However, they argue that these deficits are vastly outweighed by the platforms' ability to connect you with people you could never have reached before. This added layer of connectability has essentially rewritten the six degrees of separation rule: now everyone is, in fact, one Google, Facebook, or Twitter search away from any other person in the world. This new and unusual ability to be connected to anyone in the world is, they claim, a more than worthy trade-off for increased multitasking and, secondarily, lowered performance on everyday activities.
     This is an incorrect evaluation of the capabilities that online social networks have provided. Worldwide connectability and multitasking do not depend on each other, nor are they mutually exclusive. Both can exist in their most useful state at the same time, which would be the best-case scenario: giving people the ability to connect with anyone on Earth at the click of a button, while not distracting them or making it harder to perform quality work under the burdens of multitasking.
     The positives of online social networking are indisputable. It provides a tool for massive social interaction among many people you would otherwise never know or be friendly with. It plays host to some of the most helpful collaboration and planning tools available, and it is also great for organization and data storage. However, with its positives come many negative repercussions in the form of reduced communication abilities. People using online social networking lose some of their good body language, their proper diction and grammar, their ability to respond in a timely manner, and some of their ability to come up with meaningful topics of conversation. Teens who use these websites extensively lose the ability to communicate at the same level as teens of past generations.

Citations:
  1. Scott, Greg. "Survey Says: Global Time Spent on Social Media Surges." Web log post. Drop Ship Access. 25 Mar. 2010. Web. 5 Apr. 2010.
  2. Wang, Mark. "Social Networking Impact." E-mail interview. 7 May 2010.

Saturday, May 8, 2010

Google Code Jam

Google Code Jam is upon us. The Qualification Round has begun, with roughly 16 hours left to completely solve at least one of the three questions.  At least 4000 contestants have made it past the qualification threshold so far.  I'd predict between 6000-8000 people will make it through the qualification round by the end of the 24 hours.

The three questions this year are of varying degrees of difficulty.

The first, "Snapper Chain," deals with figuring out whether a certain light will be ON after a certain number of snaps (working just like a clapper).  The catch is that there can be an infinitely long chain of these snappers, so the 2nd one only turns ON if the first one is already supplying it power.  Many people can see pretty quickly that this breaks down mathematically to a very simple power-of-two equation.
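Since the round will be long over by the time most people read this, here's roughly what that power-of-two observation looks like in code (a sketch with my own function name, so double-check the math yourself):

```python
def snapper_on(n_snappers, k_snaps):
    # After k snaps, the chain's state is just k written in binary:
    # the light at the end is ON only when the lowest n bits of k are
    # all 1, i.e. when k mod 2**n == 2**n - 1.
    mask = (1 << n_snappers) - 1
    return (k_snaps & mask) == mask
```

With one snapper, this says the light is ON after 1, 3, 5, ... snaps, which matches snapping a single clapper by hand.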

The second question, "Fair Warning," is, in my opinion, the hardest of the three.  For this one, you are given a certain number of events that occurred in the past, and the number of seconds ago that each occurred.  The task is to find a singularity point for all of these events, at the current time or at X seconds in the future.  The singularity is the point in time at which every event's distance into the past is divisible by a constant T.   The catch is that you want to find X based on the maximum possible value of T.
As this is the hardest and took me a few minutes to conceptualize, let me offer a tip on how to think about this problem and where to get started.
Let's say I give you two numbers, 100 and 200.   If I asked you whether 150 could possibly be a factor of both of those numbers, you would say no.  The reason you would know that is that the difference between those two numbers is less than the factor I gave you...
This is the only hint I will give.  I'd give too much of the answer away if I say more.
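Without spoiling anything further, that hint can be played with as a tiny check (the function name and the numbers are just my own illustration):

```python
def can_be_common_factor(t, a, b):
    # Anything that divides both a and b must also divide a - b, so
    # when a != b, a candidate factor larger than |a - b| can be ruled
    # out before you even test the divisions.
    if a != b and t > abs(a - b):
        return False
    return a % t == 0 and b % t == 0
```

can_be_common_factor(150, 100, 200) is ruled out immediately by the difference test, exactly as in the example above.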

The third question, "Theme Park," is easy for the small set, but tricky for the larger one.  I actually messed up on the large one because I couldn't finish altering my code in the 8-minute window you have to submit the large answer set.   This problem deals with roller coasters.  You are given the capacity of a roller coaster and the number of times it will run in a day.  You are also given the groups of people who are together in line (e.g. [5 6 2 3 1]).  Each person pays $1 for every ride, and people get back in line as soon as they get off the coaster.  You have to figure out the coaster's revenue for the day.  The caveat is that groups won't split up (they stick together like good friends =).  There are a few other tricks you'll have to figure out yourself.
Where I went wrong on the large set is not building in checks for the data.  Google gives you completely valid data sets, but they don't just follow the general template the way the small set does.  I can't tell more, otherwise I will again give away more than I should, but just think about how they could mess with you and your data.
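For the small set, the straightforward simulation described above can be sketched roughly like this (my own naming, and it deliberately leaves out the large-set tricks):

```python
from collections import deque

def theme_park_small(rides, capacity, groups):
    # Brute-force simulation, fine for the small input set only:
    # board whole groups in order until the next one won't fit,
    # run the coaster, then send the riders to the back of the line.
    line = deque(groups)
    revenue = 0
    for _ in range(rides):
        boarded, total = [], 0
        while line and total + line[0] <= capacity and len(boarded) < len(groups):
            g = line.popleft()
            boarded.append(g)
            total += g
        revenue += total      # $1 per rider per ride
        line.extend(boarded)  # groups rejoin the back, still in order
    return revenue
```

The len(boarded) guard stops a ride that could seat the entire line from boarding the same group twice.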

Good Luck to All Competitors,
See You In Round One!
 - bsquared

Saturday, April 24, 2010

Javascript/PHP Library

After coding a few websites, I have learned that jQuery is amazing and that PHP rules.  But I have also not been able to find any libraries that inherently integrate both JavaScript and PHP, or any that just provide the simple functionality common to most websites.

Personally, I don't care that much about all the other fun stuff that jQuery does, such as all of the awesome selectors, animations, and everything else. What I am in love with is the .post() method, which makes things so simple.  What I do care about is AJAXing everything... the login, the commenting form, rating systems... anything that can be AJAXed should be, IMO.
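As a sketch of what I mean, here's roughly what AJAXing a comment form with .post() looks like (the endpoint "comment.php" and the shape of its JSON reply are made up for illustration):

```javascript
// Assumes jQuery is loaded and comment.php replies with JSON like {"text": "..."}.
$('#comment-form').submit(function (e) {
  e.preventDefault();                          // stop the full page reload
  $.post('comment.php',
         { text: $('#comment-text').val() },   // POST data the PHP side reads
         function (reply) {                    // runs when PHP responds
           $('#comments').append($('<p>').text(reply.text));
         },
         'json');
});
```

The whole round trip happens in the background, so the visitor never leaves the page.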

Another JavaScript tool I love to use is Google Analytics.  They track everything for you, and yes, I do mean everything...  If it can be recorded using JavaScript, Google saves it for you.   But the problem with this library, as with jQuery and all the others, is that they take TIME, which is so important in web development.  Not only do they take time because of their huge footprints, but they also take time because of all the DNS queries: jQuery hosted somewhere (hopefully on your server), Analytics on Google's servers... all of these take time to load, and it adds up.

As is evident in the writing above, I think there needs to be a JavaScript library that focuses mainly on load time, and that also has great tools to help developers with simple tasks such as user authentication and event logging.  I'm not gonna just sit here, say what this library should have, and pass it off to another developer, so I am announcing here the launch of my JavaScript/PHP library, currently entitled "vi".  This should hopefully reach a beta release in about a month.  Keep in mind that this will be written in PHP and JavaScript to optimize load times and such.

Another focus of this library will be to make it as customizable as possible, and also to make it easy to extend with plugins. I have had horrible times trying to integrate forum software into an existing user authentication system, and editing its code to change the features for that forum's specific use.   So one of the initial plugins I will write will be an extremely easy, customizable forum that will integrate with what you already have in place.

Monday, April 19, 2010

I May Just Be Kuler Than You

I say so because I just learned about an awesome website that is a great tool for designers (such as myself).  Kuler is a service run by Adobe where users can upload color swatches, in groups of 5, that they think work well together.  Then, people can go on Kuler and take a look at the most popular and highest-rated color combinations.  Earlier today I went online to look for a color group for a website I was designing.  The site deals with the Green movement, so I was looking for earthy colors to use.  I looked around, and one of the most popular swatches was called "Park Avenue Shift."  I took a screenshot, opened up GIMP, loaded the 5 colors into the saved color spots in the color picker, and went to work.  After 5 or so hours I came up with a nice-looking design.  Looking at it now, I think I may need to change the colors up a bit, but that will be as easy as going back to Kuler and looking for another green-centric swatch.

Hopefully you find good use out of this website, and if you find a particularly good looking color combo, go ahead and upload it for the good of other designers!

For those who want to see the rough design, here it is:


 I still have to play around and find a font that I like, and have to make the logo look better.  Lemme know what you think if you have comments.

Friday, April 16, 2010

A Bit About SSH

So yesterday I made a very detailed post about using SSH to get around networks and use torrents anywhere you want.   I thought about it more last night and realized I should elaborate a bit on WHY it works.  Basically, when you torrent stuff, you connect to a tracker, just a regular website, and say 'Hey, I wanna download THIS file, who has it?', and the tracker says 'Here is the list of everyone who has all of it, and everyone who has some of it' (aka the seed/peer counts).   Then your torrent client does some work to pick out the best seeds for you and connects to them, and you start to download the file.

When IT people look at their traffic, they will suddenly see data popping up on some random port, let's say 61324.  They can then look at the stream, and they can see very easily that you're downloading music or movies or games.

What we do to fix this is run all of our traffic through SSH.  You might be wondering why they can look at the torrent traffic but not the SSH traffic... good question.  The answer is that all SSH communication is encrypted.  Yes, the IT people could decrypt it with a lot of work, but they won't. I will personally guarantee this. SSH traffic is normally expected: it is used for remote systems and any other application where you want data encrypted.

You may have realized this by now, but SSH is just a tunnel that the IT people can't look into.  It has nothing special to do with torrents.  So you can apply the same settings explained in my last post to ANY application that supports a Socks4 or Socks5 proxy.
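For anyone not on Windows (or just curious), the whole PuTTY setup from my last post boils down to one OpenSSH command; the username and host name here are placeholders:

```shell
# Open a local SOCKS proxy on port 6633, tunneled through the shell account.
# -D 6633 = dynamic (SOCKS) forwarding; -N = don't open a remote shell.
ssh -D 6633 -N qwerty@shell.example.com
# Then point any SOCKS4/SOCKS5-aware app at localhost:6633.
```

Everything the app sends to localhost:6633 travels through the encrypted tunnel and exits from the SSH server instead of your own network.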

If you have any more specific questions about SSH, Google first, then ask me in the comments if Google fails. haha.

Torrent Without Being Caught

Alright, so there are a few things to discuss here.  First, you have to know what torrents are.  If not, use Google!  Now that that's out of the way, let's get into the topic.

Motivations:
So why would we want to hide our torrent traffic?  Well, there are a few reasons.
  • Nosey ISPs
  • Nosey School Networks
  • Nosey Bosses
You can see where this trend is going.  Basically, we are going to circumvent these problems.
For people who don't quite understand: 99% of all torrent traffic is illegal (which I do not condone), and many universities, companies, and ISPs have built-in methods to detect torrent traffic on their networks.

The How-To:
Alright, so we will be using a:
  • Torrent client (I will write this specifically for uTorrent)
  • SSH client (PuTTY)
  • SSH server 
This should work for any torrent client, any SSH client, and any SSH server, but I'll be writing this with uTorrent and PuTTY in mind.  First we have to acquire an SSH server account.  If you want to use this a lot, I would urge you to purchase an account on a server.  Mine is $5 a month (as low as $4.41 if you commit for more time), and it has worked perfectly for the past few months:  http://b2shells.com/   Basically, sign up for the Personal Shell.  When you're adding it to your shopping cart, it will look like this

At the bottom, where it asks for a Username and Password, that will be the username and password you use to access the shell account. Make sure you write that information down somewhere, because we'll need it later.

You don't necessarily have to get a paid account; instead you can look here and get an account on one of these.   If you choose to do this, note that some of them have a quota, which is how much data you're allowed to transfer over their network, usually a very small amount.  Remember that a full album of MP3s usually ranges from 80-120MB.

Once you have your account set up, we need to download PuTTY.  If you're running Windows, I would suggest downloading the installer, but it really doesn't make a difference.  Once you have it on your computer, run PuTTY.  It's just one simple... amazing... window.

In the "Host Name" box, type in the host name; you will be given this when you register for an account somewhere.  Also be sure to put in the correct "Port" number; sometimes this is different, but it is usually 22.   Then, below, in "Saved Sessions," type in a name; this is so we can save our session to re-use later.  I usually just type "Torrent" or something like that.  Don't click the Save button yet, though... if you did, it's okay.  Next, in the left navigation bar, navigate to Connection > SSH > Tunnels.    In "Source Port" you choose a number.  I have always used 6633, I don't know why, but it has worked.  There are limitations on the number you can choose, so if you don't want to use 6633, try to pick something in the 5000 or 6000 range for the sake of making sure it works.  Remember this number, because we'll need it later.   Don't type anything into "Destination," and on the radio buttons below Destination, choose "Dynamic" and "Auto."  Then click the Add button; "D6633" should pop up in the box.  Now go back to the Session tab in the left navigation bar and click the Save button. You should now see the session you just saved in the list below.  You can double-click it to open your session automatically, or you can click on your session, click Load, and then click the Open button at the bottom.

As a note, you can actually make this a bit easier if you want to.  In PuTTY, load your session, and in the Host Name box, in front of the address, type in your username followed by @.    So, if your username is "qwerty" and you are connecting to "shell.sshost.com", you would have "qwerty@shell.sshost.com" in the Host Name box.   Then, instead of having to type in your username every time, you'll just have to type your password.

Okay, now our shell opens up. It looks like a command prompt window, and you will be asked to log in, so type in your username (if you didn't do the step above) and your password.  You will then see that you have logged in correctly. To test, type "pwd"; that should print out the folder you are currently in.    Another note about logging in: when typing your password, it may look like no characters are showing up.  This is supposed to happen.  Just carefully type in your password and hit Enter.

If you want to, play around here and use the storage that you get.  Make some folders with "mkdir FOLDER_NAME".  Navigate through folders with "cd FOLDER_NAME"; "cd .." takes you up one directory.  "vim" brings up the text editor... it is confusing and I will not cover it here.  Maybe in a future post, if people want to learn about its amazingness.

Now we are done setting up SSH and just have to prepare the torrent client.   Install uTorrent and open it.  Make sure no torrents are running yet, as they would be using the regular network until we change the settings, which could get you in trouble.

In uTorrent, click Options > Preferences.    In the left navigation bar, click on "Connection."  In the "Proxy Server" box, for "Type" choose "Socks5"; for "Proxy" type in "localhost"... "127.0.0.1" will also work, it's your choice. For "Port" we need to put the port we chose in PuTTY; I chose 6633, so I would type in 6633.  Pretty simple.  Uncheck "Authentication," and check "Resolve hostnames through proxy" and "Use proxy server for peer-to-peer connections."   Now hit "Apply" and then "OK".

Now download a torrent and open it inside uTorrent and see if it downloads.  If it does, then you have set up correctly... if not, you're out of luck.

Finishing Remarks:
Hopefully this was an easy-to-follow guide, and everything is up and running correctly.   Sometimes, if you let your SSH session run for a very long time, it will disconnect because of a software error or something.  In this case, just click "OK", then right-click on the PuTTY window and select "Restart Session," and it will try to reconnect you.  Usually you'll have to log in again.

If you have any questions or want more pictures, lemme know in the comments. 

Wednesday, February 3, 2010

College Park

Ah, it's nice to be back in College Park.  Now that I'm back into the groove of college, I'll get back to more regular posting.


...Updates
So here are my current projects, well these aren't all concurrent, but it is my list of stuff to get to:

  1. Secret Project 1 (SP1)
  2. SP2
  3. SP3
  4. My Blog  (www.bmbsquared.com)
  5. Java CLI
  6. Spare Our Green (SOG)
  7. SP4
  8. debian Install on Android
  9. SP5
  10. SP6
  11. RC Spy Heli
  12. Assembly (x86)
  13. Python
  14. FL Studio
  15. Personal Point System
    1. Vocab
    2. Physical Look
    3. Personality
    4. Relationships
This is really the reason I'm so inefficient: it's easier for me to come up with cool ideas than to actually act on them, so the list just keeps on growing and growing.   Sorry for all the secret projects.  They are either possible large-profit-margin ideas that I haven't patented or started work on... or just something I shouldn't be broadcasting over the internet.  haha.   You can expect that those secret projects will eventually be revealed once I begin work on them.   On top of all that, I also have school work.  

So here is my plan of attack for the next few months.  School comes first =(  I know, it sucks, but I need to pull my GPA up a bit.  After that, I need to work on SOG and finish it by March, because we're launching our beta test then.   Realistically I should also be able to get a rough implementation of my blog done, finish SP6, and install Debian on my phone by that time.  Then we have Spring Break; I'll have to do my taxes, but I can really spend quality time developing my blog, learning Assembly as well as Python, and developing SP1.   And at some point I will have gotten around to playing with FL Studio!  So the new list of projects at that point will be:
  1. SP2
  2. SP3
  3. Java CLI
  4. SP4
  5. SP5
  6. RC Heli
  7. Personal Point System

When I head back to school, the RC Heli and SP5 will take precedence.  I'll also work on SP3.  By the end of May, when I'm done with school for the year, I will have to add another Secret Project to my list (I already know what it is, but it's off my list for now because it isn't ready to be worked on... if that makes any sense). So my summer will be spent at my job and, in my spare time, working on the Java CLI, SP2 (maybe), and SP4.  Also I'll have to work on my new secret project (SP7).  

Throughout all this, I'll also be working on keeping up with my Personal Point System to make myself a better person. 

As I said, expect there to be posts on nearly all these SPs once I begin work on them.    Well, this was probably a boring post for all who read it, but it was great for me to organize my thoughts.  

Monday, January 18, 2010

Website

After having this blog for what, 2 years now, and writing only 8 posts in that time, I've decided that I need to actually start writing stuff here. Not just any stuff, but good stuff, stuffy-stuff, you know what I mean? Well, I have decided to actually put an effort into keeping this blog up-to-date with all the projects that I work on, and to share my code, so that everyone can benefit from my projects.

...The Big News
As part of this sort of New Year's resolution, I have decided to move away from a dedicated Blogger blog and actually run my own website, where I can have more than just this very linear-styled blog. Instead, I will be able to post tutorials, have dedicated pages for my code, implement nice syntax highlighting for the code I post, etc. So I'm very excited about this change. After nearly 6 months of working with Fatcow (a great web host IMO) on the www.spareourgreen.com project, I have figured out exactly what I do and don't want in a web host, so now that I am getting a new website, I have been searching for the perfect host.

...A Perfect Fit
After hours of searching, reading reviews, visiting websites to look at every feature, and multiple talks with customer service reps, I was still unhappy with every option I had found! I am being kind of picky, but I just knew that someone had to have everything I wanted (at a reasonable price). On my list of need-to-haves were:
  • Shell Access (SSH)
  • Cron Jobs
  • FTP
  • Unlimited MySQL (or, even better, PostgreSQL)
  • PHP
  • Python
  • Perl
  • Ruby (would be nice, but can live without)
  • Some sort of SSL
  • IMAP & SMTP servers (or ability to change MX records to use Google Apps)
  • Some kind of control-panel / file manager
  • unlimited sub-domains
  • custom error-pages (preferably the ability to edit .htaccess)
  • And SSI would be cool, but not necessary
A pretty extensive list. Nothing too arbitrary, but believe me, it's difficult to find all of that available in one package! Like I said, all my searching led me to nothing. Then, after a weird Google search like "php python perl postgre mx", I came across yet another website that listed and compared web hosts, but this one was specific to finding good Python hosts. I searched the page for "ssh", found 6 instances, and clicked on all 6 of the links to the web hosts. The first 3 were dedicated Python servers, but the 4th one I looked at had everything I wanted, and more! It has everything on my list above, plus it has Git (and Subversion, but why use that if I have Git) pre-installed, and they are already compatible with Magento!!! <3 This isn't all that important to me since I don't really plan on selling anything (at least not in the near future), but I just think it's really cool because about 2 or 3 years ago, I spent close to a week getting Magento installed on a home server with a bare-bones installation of PHP. It was such a pain in the ass, so it's just cool to see that they're pre-compliant with it. Now that you're all wondering who this amazingly awesome web host is, I won't be telling you!

Just kidding. The host I've been talking about is A2 Hosting. And all of this that I've mentioned above can be bought for as little as $4.77 a month if you are looking to commit for 36 months. I only chose the 6-month plan cuz I first have to prove to myself I'll actually use it, but if I can make it until the summer, then when I re-purchase the plan, I'll get the 36-month one.

...My New Website
Now I'm just waiting for the domain name to finish registration and be updated with the correct name servers. Then I will start development of the site, and soon afterwards I will be transferring this blog over. The domain name is (just like this blog) bmbsquared.com. I look forward to developing this site, as I will try to include many new technologies and make it very Web 2.0-ish. See you at the site.

- B-Squared

Sunday, January 17, 2010

Fun With AI

So at my robotics club, one of the other mentors and I are both extremely interested in AI, and he has been talking with me about some different projects we're working on. Some of the younger kids became interested, and we started to describe to them a simplistic view of AI. He used the example of balancing a stick on a finger. Picking up an aluminum bar, he demonstrated how an AI robot would slowly learn to balance the stick. Then I gave the specific example of evolutionary algorithms (which I have the most experience with). He then said that evolutionary algorithms wouldn't be able to do that... but I was adamant that it could be done. So I've decided to prove him wrong by actually doing it.

...Starting Work
I've started writing a simple 2D physics engine to be used just for this stick-balancing problem. I will be utilizing the JGAP Java package to simplify the work I have to do, since it provides all of the mutation and crossover algorithms. Basically, this boils my work down to providing the correct functions that the program will be allowed to call.

...The Constraints
I have set some arbitrary constraints for this problem. I will allow it to work on a fake track that is a set length, probably 1 or 2 feet. Like I said, this demonstration will only be in two dimensions, so the problem is actually not very complicated at all. Oh yea, this project will be in Java, and I'll post all the code once I'm done.
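The real version will be Java with JGAP, but the core evolutionary loop is easy to sketch now. Here's a rough Python mock-up of the approach: the physics is a deliberately crude stand-in for my actual engine, and every constant (population size, mutation size, track limits) is just a guess, not the final design.

```python
import math
import random

def simulate(weights, max_steps=500):
    """Count how many timesteps a linear controller keeps the stick up.
    The dynamics here are a toy stand-in for the real physics engine."""
    g, dt = 9.8, 0.02
    theta, theta_dot = 0.05, 0.0   # stick angle (rad) and angular velocity
    x, x_dot = 0.0, 0.0            # cart position (m) and velocity
    for step in range(max_steps):
        # controller: force is a weighted sum of the state variables
        force = (weights[0] * theta + weights[1] * theta_dot
                 + weights[2] * x + weights[3] * x_dot)
        force = max(-10.0, min(10.0, force))     # actuator limit
        theta_ddot = g * math.sin(theta) - force * math.cos(theta)
        theta_dot += theta_ddot * dt
        theta += theta_dot * dt
        x_dot += force * dt
        x += x_dot * dt
        if abs(theta) > 0.5 or abs(x) > 0.3:     # fell over, or off the track
            return step
    return max_steps

def evolve(pop_size=30, generations=40, seed=42):
    """Tiny genetic algorithm: keep the best half, breed the rest."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(4)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=simulate, reverse=True)     # fitness = steps survived
        survivors = pop[:pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, 4)            # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(4)] += rng.gauss(0, 0.5)   # point mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=simulate)

best = evolve()
```

None of the numbers mean anything physical; the point is just the select-crossover-mutate loop, which is exactly the part JGAP will be handling for me in the Java version.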

I've been very lazy about coding over this break from school, so this challenge will hopefully kick me into gear and get me working much harder than I have been during my last week at home. To clarify, I actually program 40 hours a week at my job; outside of work, though, I haven't gotten much done.

Wednesday, January 6, 2010

third, second, FIRST!!!!!

Once again, everyone's favorite robotics competition is about to start, and all of the teams are excited to receive this year's unique challenge! It's FIRST Robotics. Every year, thousands of teams worldwide compete in this high school robotics competition.

...Atholton RAID
Last year was my final year at Atholton, so it was my final year competing on our HS FIRST team. But this year, I have come back as a team mentor! This evening we had our first meeting (no pun intended), and I had to catch up on a lot of information. Our club has nearly doubled in size, from only 12 dedicated members to 30! Everybody had already chosen their sub-teams, so we knew who would be focusing on electrical, design, programming, and hardware. On Friday we'll meet again, and then Saturday is the kickoff, when the whole club will gather and watch a live screencast of the presentation of this year's game by famous inventor Dean Kamen and esteemed MIT professor Woody Flowers.

...The Competition
This year, the game has been designed to be much more interesting for spectators, so that there will hopefully be a non-technical audience to whom it can appeal! The game pits teams of three robots against each other. Each team has 2 goals in the corners on their side of the field, and scores 1 point by kicking a soccer ball into a goal. There are 12 soccer balls on the field at a time; as they are scored, the human players re-enter them onto the field through a ramp structure above the playing field. If you wish to see more about the game, visit www.usfirst.org.

...Our Robot
Now just over a week into the 6-week competition, we have finalized the chassis design of our robot and the list of manipulators we'll be using, and have begun working on designs for each of them. We plan on a long- or short-range ball-shooting mechanism, as well as a way to pull our robot up 20 inches off the ground using the tower that is built on the field. We are looking at linear pneumatic launchers, as well as potentially using rotational energy to launch the ball. We will also be using two cameras: one for ball viewing, and another for finding the target above the goals. This will help to keep the balls under control, and also to accurately measure the distance from the robot to the goal. Our main objective here is a "fire & forget" system. We will have a light on our driver station that will light up if the cameras see a ball and the target. If both are seen, the driver can just tap the trigger on our joystick, and the software will then move to get the ball into alignment with the target, compute its distance from the goal, and fire accordingly. As for our robot-lifting mechanism, we are evaluating the feasibility of a scissor-lift system that will pull up the robot. The final design concept that has been bouncing around is a system to pick up another robot and carry it on top of us as we lift ourselves off the ground. This would gain us an extra 3 bonus points at the end of the game, in addition to the 2 we get for pulling ourselves off the ground. I feel that if we can get this idea to come to fruition, it will most definitely be one of the top robots at the competition.

Tuesday, January 5, 2010

RPS Competition!!!

Anyone else as psyched as I am??? No??? Huh, that's weird. Well anyway, I have decided to host my own rock, paper, scissors tournament! Still not interested? How about if I told you that you had to program an RPS bot to play the game against another person's RPS bot? If you're still not interested, then there's no hope for you... If you are interested though, then read on!

...The Idea
So after my daily reading of the Official Google Blog on Nov 25th (maybe it was the student blog???), I saw that the University of Waterloo had hosted a rock paper scissors programming challenge. I was extremely interested. I looked over all of the sample code, and I read a few articles about logic that can be used in the game. This was my sole inspiration to hold my own tournament. As of this evening, I will begin to write the Java bot that will moderate the game. Hopefully I can finish it by the 23rd of Jan, so that I can make a website for it and begin to advertise around the UMD College Park campus.

...The Competition
Then, a week or two into February, I'll open up registration so people will be able to test their bots. Every other day, I'll run the competition on all of the submitted code. This will continue for three weeks, and people will be able to tweak and re-upload their code to make it smarter. Then on the final weekend (probably a Saturday), I will host a big RPS party where everyone can submit their final programs and see who comes out the winner. I'll eventually think of prizes for the top three or so.

I hope that other people are as excited as I am about programming RPS artificial intelligence. I think it will be very difficult. As it gets closer to February, I will try to post some basic RPS logic so that people can get an idea of how to go about programming this.
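To give a taste of what "basic RPS logic" can look like, here's a hypothetical bot: it's in Python rather than the Java the moderator will use, and the throw/observe interface is something I made up for illustration, not the final submission API. It simply counts the opponent's past throws and plays whatever beats their favorite:

```python
from collections import Counter

# maps each move to the move that beats it
BEATS = {'rock': 'paper', 'paper': 'scissors', 'scissors': 'rock'}

class FrequencyBot:
    """Plays the counter to the opponent's historically most common throw."""
    def __init__(self):
        self.history = Counter()

    def throw(self):
        if not self.history:
            return 'rock'                          # arbitrary opening move
        favorite = self.history.most_common(1)[0][0]
        return BEATS[favorite]                     # counter their favorite

    def observe(self, opponent_move):
        self.history[opponent_move] += 1

# quick demo against an opponent that always plays scissors
bot = FrequencyBot()
for _ in range(10):
    bot.observe('scissors')
print(bot.throw())   # prints "rock"
```

Of course, a frequency counter like this is trivially exploitable by a bot that models *it*, which is exactly why the game gets interesting.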

Thinking about doing this competition also has me thinking about other AI competitions that I could run. If you have any ideas lemme know in the comments.

A Quest For The Holy Grail

...well, not really. But I am learning Python! After years of programming in just Java, I finally thought it would be a good idea to branch out. I then took on the website project I described in my last post, and from that I have come to gain a good understanding of PHP and its useful libraries, SQL queries, and Javascript. But I wouldn't consider any of these a true, fully functional language, as they are all built on web infrastructure.

...Picking a Langue
Yes, I know that language is spelled with "age" at the end... but I meant to say langue. Langue's meaning is very close to that of language; the difference is that a language is, well, a language, but langue (noun) means a system of conventions and rules upon which we communicate. So I would actually say programming languages are more langue-like... if that makes any sense.

Anyway, after deciding that I needed to learn a new language, I started researching all of the possibilities, based on which ones could provide a benefit to my career. My final list was as follows:
  • C or C++
  • Python
  • Ruby
  • Erlang
  • Perl
  • LOLCode
First, I ruled out any without a true compiler/interpreter. KTHXBYE LOLCode :'( Now that I had 5 viable options, I looked at things such as:
  1. Will I have to learn it in the future?
  2. What types of programs will it write?
  3. What kind of learning curve is there?
  4. How big is the user base?
My first question, will I have to learn it in the future, only targeted 1 of the items on my list. Since I am in the University of MD robotics club, I will eventually have to learn C so that I can program PIC microcontrollers for the robot. I will actually have to do this next fall, and it will also most likely be needed for some EE classes later on in my junior and senior years. For this reason, I ruled out C/C++. (Also, if needed, I actually can churn out a small C program; I just never got past the basics.)

Now I was left with Python, Ruby, Erlang, and Perl. I separated them into 3 categories (Perl, Python/Ruby, and Erlang) because Python and Ruby are so similar. Next, I eliminated Erlang. Although I would have loved to learn a true multi-threaded language, questions 2 and 4, as well as it being hard for me to find a legitimate use for it in my code, turned me away from Erlang. Although I have skipped it for now, I feel that the next time I decide to learn a new language from the ground up, it will be Erlang (hopefully in 2 years or so).

...Python vs. Ruby vs. Perl
Now that I was down to three, I once again grouped Python and Ruby together due to their similarity, and compared them to Perl. After looking at some code and reading all of the introductory tutorials, I felt much more comfortable with Python/Ruby than Perl. Then, after reading other people's testimonials, such as this one, it wasn't a hard choice for me to eliminate Perl.

Choosing between Python and Ruby was difficult, as they are sooo similar, and unlike the case of Python vs. Perl, no one was willing to say that one is truly better than the other. My final decision boiled down to the fact that Python is much more widely used (in my opinion), and more importantly to me, Google uses Python extensively. That pretty much sealed the deal, as I have an incessant infatuation with Google.

...Beginning to Learn
Now that I have chosen, it's time for me to begin to learn the language. I just purchased my book, and I will be updating the blog with my progress and any interesting tidbits now and then. But as this is my second project outside of my 40-hour work weeks, progress will most definitely be slow.

The New Year! ...and Correct PHP Code Design

It's been a while since my last post, but I've made it a sort of New Year's resolution to start posting more regularly here. My focus will no longer be solely on AI stuff; rather, I will cover any tech topic that I am learning about or using.

...Catching Up
So over the past few months, I have put a lot of time into developing my website, www.spareourgreen.com. If you actually want to hear about what the website does, go ahead and visit it; here I will just be covering the technology used. Never having written any websites other than straight HTML and Javascript, I was excited to learn PHP. After reading a book about PHP basics, I dove right in and started coding the first 5 or 6 pages. I finally got all of them working correctly, but then realized a problem: every time I added a new menu item, I would have to add the new link to each page individually. Then I came to the realization that I should dynamically load all of my web pages from one centralized source, and keep individual page content in its own file. This way, updating the menu or the footer makes the change across all the pages.
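The site itself is PHP, but the centralized-layout idea is language-agnostic. Here's a minimal sketch of the pattern in Python; the menu items and markup are just placeholders, not anything from the real site:

```python
# Shared layout lives in exactly one place; each page supplies only its body.
MENU = ['Home', 'Projects', 'Contact']

def render_menu():
    """Build the navigation list from the single shared MENU definition."""
    return '<ul>' + ''.join(f'<li>{item}</li>' for item in MENU) + '</ul>'

def render_page(title, body):
    """Wrap page-specific content in the shared header, menu, and footer."""
    return (f'<html><head><title>{title}</title></head><body>'
            + render_menu() + body
            + '<footer>bmbsquared.com</footer></body></html>')

# Adding a menu item here changes every page at once: no per-page edits.
MENU.append('Blog')
page = render_page('Home', '<p>Welcome!</p>')
```

Every page calls the same render function, so a one-line change to MENU (or the footer) propagates everywhere, which is exactly the property my early copy-pasted pages lacked.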

...Another Challenge
Now, after nearly 4 months of working on this code, with different pages loading different content into their menus and such, my code is once again a horrific mess! When a page breaks after adding a small new feature, I have to pick through hundreds of lines of unstructured code to try and find my problem. Usually I know where the problem lies, but even then it is still a pain to find. After recently adding a new section to my website that has vastly increased the complexity of my code, I have decided to wipe out all my old code (not really; never delete your old code, just .7z it and save it on your HDD) and re-write the website using well-structured, scalable, object-oriented PHP.

With object orientation, making a broad change across all of my pages is as simple as adding a new function or method to an existing class, or adding a new class to my main .class.php file. Then, I can choose which pages this needs to be implemented on, and make 1 new function call on those pages. The way I am using OOP could actually be implemented the same way I had done it previously, using just 1 PHP file and adding new functions, but as I saw, that can get pretty messy. Working with objects is much easier to keep tidy.

...Progress
Right now, I have just begun this new initiative. I have written a User class to store data about the person visiting the site, as well as a Page class, which will be my initial call on every PHP page. So here is what a sample page could look like:

<?php require_once('../classfile.class.php'); $file = new Page(); ?>
........HTML or PHP page specific content goes here............
<?php $file->close() ?>

As I continue to work on this code, I will keep updating on my progress and try to provide information on any problems that I run into so that you may learn from my mistakes.