Archive for the 'Computer' category

Capturing and Analyzing FTP Traffic

 | October 31, 2011 5:56 pm

FTP is one of the oldest network protocols still in use. First specified in 1971 as a way to quickly move files between computers, it has been in continuous use ever since. It’s particularly common on the web, where it is routinely used to upload files and data to servers.

Unfortunately, while common, it is also insecure. FTP transmits user credentials, file contents, and other data in the clear. For that reason, anyone with a packet sniffer and a bit of patience is free to take a look at it.

This video looks at the security of FTP traffic. It covers:

  • How to set up an FTP server on Windows Server 2008 and configure a simple site
  • The use of a packet sniffer (Wireshark) on Ubuntu to monitor network traffic

You can watch the video here.
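To make the risk concrete, here is a minimal sketch (my own illustration, not code from the video) of what a sniffer sees. The FTP control channel is plain ASCII, so a captured TCP payload can be scanned for the USER and PASS commands directly; the capture bytes below are hypothetical:

```python
# A hypothetical FTP control-channel capture, as a packet sniffer would see it.
# Both the username and the password appear as plain ASCII on the wire.
capture = (b"220 Service ready\r\n"
           b"USER alice\r\n"
           b"331 Password required\r\n"
           b"PASS s3cret\r\n"
           b"230 User logged in\r\n")

def extract_credentials(payload: bytes):
    """Return the (user, password) pair found in an FTP control-channel payload."""
    user = password = None
    for line in payload.decode("ascii", errors="replace").splitlines():
        if line.startswith("USER "):
            user = line[len("USER "):].strip()
        elif line.startswith("PASS "):
            password = line[len("PASS "):].strip()
    return user, password

print(extract_credentials(capture))  # -> ('alice', 's3cret')
```

In Wireshark, a display filter along the lines of `ftp.request.command == "USER"` does the same job interactively. Either way, no decryption is involved, which is the whole point.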

Securing a Network: Part 1

 | October 26, 2011 6:31 pm

As difficult as it can be to secure individual computers, making sure that a network is secure is even more challenging. This is because, instead of working with a single machine, you have an entire network of devices to worry about. It’s a classic case of, “if the security of one is threatened, we’re all threatened.”

Luckily, there are several tools that can be used to “harden” individual computers, thereby making the network as a whole more secure. This series of videos will explore a few of them, including the Windows Server Security Configuration Wizard, security templates, and some Linux/Unix security best practices.

This first video kicks things off by looking at the Windows Server 2008 Security Configuration Wizard and shows you how to configure a simple firewall setting.

You can watch the video by going here.

Installing and Configuring DHCP on Windows Server Core

 | 1:21 am

Note: This is a cross post from Apolitically Incorrect. If you would like to comment or remark, please consider stopping by.

Windows Server Core is a relatively new version of Windows Server. Like its slightly more mature sibling, the “full” version, it is tremendously powerful. Server Core allows you to set up Active Directory domains, DNS/DHCP, and web servers. It can help secure your infrastructure, and probably floss your teeth.

But that isn’t what makes it interesting. Server Core is interesting for what it doesn’t have: the Windows Server GUI. As with Linux servers, nearly all of the action happens on the command line. This makes Server Core lightweight and an excellent candidate for network virtualization, as it can run all of the core networking services needed to administer a domain.

In this video, we take a look at how a Server Core installation can be configured to run as a DHCP server. It walks you through the process of installing the DHCP Server role from the command line, registering the DHCP service with Active Directory, and configuring the first scope. When combined with the earlier Active Directory tutorial, this video describes a way to run the three core networking services needed for domain administration – DNS, DHCP, and Active Directory – on a single server.

This lays the groundwork for later networking and security tutorials by allowing us to use the less resource-intensive Server Core for simulation and exploration, rather than a full Windows Server virtual machine.
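For reference, the command-line sequence on Server 2008 Core looks roughly like the following. This is a sketch with made-up scope names and addresses; the video shows the exact steps for your environment:

```shell
:: Install the DHCP Server role (the package name is case sensitive)
start /w ocsetup DHCPServerCore

:: The service is installed disabled; set it to start automatically, then start it
sc config dhcpserver start= auto
net start dhcpserver

:: Authorize the server in Active Directory (requires Enterprise Admin credentials)
netsh dhcp server initiate auth

:: Create a scope and an address range (example values only)
netsh dhcp server add scope 192.168.1.0 255.255.255.0 "Example Scope"
netsh dhcp server scope 192.168.1.0 add iprange 192.168.1.100 192.168.1.200
```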

To view the video, please visit blog.oak-tree.us.

Hannibal, Napoleon, and Charles Joseph Minard

 | February 22, 2010 5:49 pm

Charles Minard - Railroad Routes

No study of the history of scientific communication can be complete without mention of Charles Joseph Minard, a 19th century French civil engineer and cartographer.

At the end of his life, Minard created two very famous examples of statistical charts, called flow maps, that every scientist, engineer and student should be familiar with.  The first shows Hannibal’s crossing of the Alps (218 BC, Second Punic War), and the second depicts Napoleon’s disastrous invasion of Russia (1812-1813).

Both examples are beautiful works of art and masterful examples of evidence.  But they are also more than that: they tell cohesive and interesting stories.  In this post, I thought it might be interesting to take a closer look at the history of Hannibal and Napoleon, and highlight the ways in which Minard’s charts help us to explain the eventual outcomes of their campaigns.

(Note: High resolution, PDF versions of the two maps are available for download.  These versions have been translated from the original French.  To download, either click on the images, or here for the Hannibal invasion of Northern Italy, and here for the French Invasion of Russia.)

Show me more... »

Typeset Your Curriculum Vitae – Part 3: Automatically Generate a List of Publications

 | December 2, 2009 11:19 am

Publications are the currency of ideas.  Through them the experts, thinkers and dreamers of this world can share their thoughts and insights.  A good publication is not only influential, but even capable of shifting the course of a whole society, as Martin Luther King, Jr. demonstrated with his “Letter from Birmingham Jail”.

Since publications are so important to the dissemination of knowledge, there is a rather high expectation that an academic author should publish prolifically.  The mantra “Publish or Perish” is not just a clever quip, but a very serious way of life.

It is ironic, then, that the most prolific of academic writers can suffer from a surprising problem: it can be very difficult to keep track of all of their work.  Yet an up-to-date CV is very important.  After all, publishing your work in influential journals is an important first step toward earning tenure!

Members of a research team or those who collaborate outside of their institution experience this same problem, only more so.  Such a person may work on many projects at once, but only have direct responsibility for one or two of them.  This places the researcher in the unenviable position of trying to track the work of others.  This situation becomes even more complicated if the collaborator refuses to play by the rules of common decency.

It would be nice, for example, if the primary author of a publication would notify the co-authors of its progress, or when it has been submitted.  But … that doesn’t always happen.  Academic researchers are busy people, and soliciting feedback from all of your collaborators can be difficult … and there is a tendency for difficult things to go undone.  Thus, if you don’t follow what your teammates are working on, it is quite possible that an abstract might have gotten submitted while your back was turned.

To stay on top of the “delightful chaos”, you need to have some kind of system.  Personally, I keep my list of projects and publications in three places. The first (and perhaps most important) is the hand-written list in my experimental notebook. Any time I hear about a new project, it gets added to this list. I keep track of what I’ve contributed, what papers or abstracts have been created from the data, and what their status is. When I know that an abstract or paper has been accepted, I then create an entry for the item in my bibliography manager. Once in the bibliography manager, I can cite the reference in other documents such as proposals or related papers.

About once a year, I go through the tedious process of updating my CV. This typically involves manually sorting through both my project list and my reference database, accounting for new items and reconciling differences. Every time I do this, it's painful; and because I’ve historically formatted the reference list by hand, it's not uncommon for a typo to sneak its way in or for an author to accidentally get left off of a citation. These mistakes are never intentional, but they do happen.

When I find such an error in the reference database, I fix it. But since I often import these references from websites, the errors tend to be few and far between. Moreover, my reference database is something that I use every day; as a result, it gets a lot of scrutiny. My CV, on the other hand, gets updated much less frequently and errors tend to persist longer.

For a very long time, I've wanted to automate the process. Instead of keeping three separate lists – active projects, reference database, and CV – I’d prefer to keep only one (or two). But I've never found a really satisfactory way of doing so.  Or at least I hadn’t found a system until quite recently.

In my last review of different ways to typeset a CV, I came across an interesting article by Dario Taraborelli.  In it, he described how to create a CV based on the standard “article” document class.  It was well designed, elegant, simple and attractive.  From his work, I created the xetexCV document class.  Additional research turned up an add-on module that makes it convenient to automatically generate a list of publications.  So, for the first time in a great while, I have finally found a simple way to automatically generate a publications list.  In this article, I will demonstrate how that is done.
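The underlying mechanism is simple. In a generic BibTeX-based sketch (the add-on module wraps the details, so the exact commands used with xetexCV may differ), the entire reference database is dumped into the document:

```latex
\nocite{*}                  % "cite" every entry in the database, even unused ones
\bibliographystyle{plain}   % any bibliography style will do
\bibliography{mypubs}       % mypubs.bib is an illustrative file name
```

Re-running LaTeX and BibTeX then rebuilds the list from the .bib file, so the CV stays in sync with the reference manager automatically.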

Show me more... »

Typeset Your Curriculum Vitae – Part 2: Extending and Customizing an Existing Document Class

 | November 30, 2009 2:54 pm

Many first-time users of LaTeX mistakenly look at the language as a type of glorified word processing software – albeit a particularly complicated one.  While such an analogy may be apt in helping new users become acclimatized to the language, it suffers from a rather nasty problem: LaTeX isn’t a word processor.

If anything, LaTeX shares more in common with a programming language than any type of application.  In fact, the document processing system is really nothing more than a bunch of re-usable pieces of programming called macros.  Everything is a macro.  That includes the commands that every user is familiar with – \title{}, \section{}, \subsection{} – in addition to the internal formatting commands that allow LaTeX to function.  (Most of the macros were originally created or packaged by Leslie Lamport as a way of making TeX – the typesetting system created by Donald Knuth – easier to work with.)

This has some rather practical consequences: because everything in LaTeX is a macro, it is far more extensible than a word processor could ever hope to be.  If you require a feature that doesn’t yet exist, it typically isn’t all that difficult to add it.  And when your extension is packaged inside a style or class, you can use those customizations in anything that you want to write.
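As a concrete (and deliberately tiny) illustration of the point – the names here are my own inventions, not part of xetexCV – a new macro and the package that carries it can be as short as:

```latex
% mycustom.sty: a minimal package wrapping a single new macro
\NeedsTeXFormat{LaTeX2e}
\ProvidesPackage{mycustom}[2009/11/30 Example customizations]

% \jobtitle{...} sets an employment title in large bold type
\newcommand{\jobtitle}[1]{{\large\bfseries #1}}

\endinput
```

Any document that does \usepackage{mycustom} can then write \jobtitle{Research Flunky}, and will pick up formatting changes for free whenever the package is revised.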

But though creating macros isn’t particularly complicated, it is a different beast than just using the stock macros for writing.  This is not surprising: the craft of design is inherently different from the craft of writing.  There are different conventions to follow and different topics to obsess about.  In the first article of this series, I introduced the xetexCV document class, which is one example of where I decided to don the designer hat.

But before you get too far down the road of customizing and extending, there are some important things that you need to know.  These include the general conventions used when working with document classes, their internal anatomy, an understanding of how macros are created, and how to handle formatting and layout challenges.  In this article, I will look at these issues in more detail, particularly as they pertain to xetexCV.  In the process of reviewing these topics, I will also explain some of my design choices.

Show me more... »

Typeset Your Curriculum Vitae – Part 1: The xetexCV Document Class

 | November 25, 2009 12:02 am

Very few documents are more personal than a curriculum vitae (CV).  A CV lists a person’s educational history, who they’ve worked for and what they’ve accomplished.  Moreover, a CV is frequently used to judge a person’s inherent worth and value (or at least exploitability).  A quality curriculum vitae matters, a lot.

For that reason, a CV not only needs to include all the pertinent information of a person’s life, but it also needs to look good. An attractive CV with good spacing and contrast leaves a positive impression and makes it easier to find information.  When a CV is laid out correctly, a reviewer might just find themselves scouring past accomplishments for interesting tidbits: “I didn’t realize that this applicant organized a lecture series with Patch Adams and other notables, that’s interesting!”

Show me more... »

Customizing LyX: Character Styles and the LyX Local Layout

 | November 14, 2009 5:00 pm

Imagine for a minute that you’re writing a book or technical manual.  Let’s say it’s a book on technology, maybe the open source tools used for scientific writing (to randomly pick an example).  As you write this book, you realize that you need some way to cue the reader into different parts of the text.

For instance, you might want all definitions to appear in bolded text so that a reader can pick out key terms quickly.  Or you might want code examples to appear in a different font than the regular text, again, so they’re easy to find.  What’s the best way to do this?

Sure, you could just bold the definitions, or manually change the font for the code examples.  But that’s painful!  Changing typeface and size every time that you have a section of code will eventually result in a lot of lost time.  Moreover, you might make a mistake, which destroys your consistency and makes your writing look unprofessional.  There must be a better way!

Thankfully, there is.  It’s through the consistent use of styles.
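On the LaTeX side, a character style is nothing more than a semantic macro. A minimal sketch for the two examples above (the macro names are illustrative):

```latex
% Key terms: bold, so definitions stand out on the page
\newcommand{\term}[1]{\textbf{#1}}

% Code fragments: typewriter type, so examples are easy to spot
\newcommand{\code}[1]{\texttt{#1}}
```

Mark every definition with \term and every snippet with \code, and a later change of heart about the formatting becomes a one-line edit. Binding commands like these to named character styles inside LyX is done through the local layout, which is what the rest of the article covers.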

Show me more... »

Statistics With R – Part 1: An Old Dog Learns New Computing Tricks

 | November 8, 2009 10:21 pm

When doing math or numerical analysis, the knowledge of the technique is far too often tied to the tool performing the calculation.  Consider an engineer whose understanding of the fast Fourier transform is inseparably tied to the fft function in Matlab.  Of course this hypothetical engineer understands what the results mean (more or less), but may not be able to duplicate his analysis if Matlab were taken away.

In most cases, it is likely that no deeper understanding will be required.  But what happens if the computer makes a mistake?  Or the program becomes unavailable?  Both situations are entirely possible.  Computer algorithms aren’t perfect and occasionally arrive at results that make little sense; and hardware has been known to fail.

When the engineer understands how the computer arrived at the answer, however, he can recognize, understand, and ultimately correct those cases where the results are unexpected.  This is an important reality check that can prevent costly disasters later down the line.  Or, if the hardware is unavailable, he can use an alternative tool or software package to duplicate the analysis.

But while such a situation can arise with any type of numerical software, it’s most likely to happen to users of a statistical package.  I find this extremely ironic, since a proper understanding of statistics is essential to living in the modern world.  (Much more so than an understanding of the fast Fourier transform, at any rate.)  The rules of probability, the normal curve, correlation, and multivariate statistics can have a direct impact on how we live our lives.  They are used in making important decisions in finance, medicine, science and government.  A misunderstanding of stats and the methods of science (from which statistics is inseparable) underlies the most divisive issues of our day: abortion, stem cell research, and global warming.

Moreover, neither side has a monopoly on ignorance or misunderstanding.  People fail to distinguish between correlation and causality, or insist on using the word “average” as a slur.  Nearly as bad are those who – like the hypothetical engineer described above – only understand statistics within the narrow context of their stats package.  Casual statisticians are nearly as dangerous as the wholly uninformed.

The Statistical Package for the Social Sciences (SPSS) is one of the biggest perpetrators of this crisis, which is hugely ironic, because I happen to love SPSS.  SPSS is probably the first statistical package to place advanced statistical methods within the grasp of the novice user.  I’ve been a happy user for nearly a decade (ever since I was introduced to the program in high school).  But there is no doubt that I’ve come to understand statistics within the context of SPSS and its GUI.

Please don’t misunderstand me, I have a pretty good grasp of basic statistics.  I can sling probability with the best of them and take relish in describing when to use the Fisher Exact test instead of a Chi-Square; but advanced statistics are a completely different matter.  Advanced stats scare me.  I can certainly use these more complicated methods.  I’ve analyzed and written about multivariate models and even ventured into Analysis of Variance (ANOVA).  But I have to rely on SPSS and the aid of my institution’s biostatistician to help me recognize when there is a problem.

Which is why, in a time of tight budgets, losing the institution’s SPSS license has been a crushing blow to my productivity.  (Whoever made that decision should be hauled out and shot!)  Because I don’t have my statistics software any more, there are certain aspects of my job that are much more difficult to do.  And unfortunately, there is only one logical conclusion to draw: I’ve become a victim of the statistical ease of SPSS.

Show me more... »

Customizing LyX: Create an NIH Grant Proposal Template

 | November 2, 2009 6:21 pm

LyX is a wonderful writing program.  It’s easy to use and produces beautifully typeset output.  More importantly, though, it lets an author focus on the content and structure of his writing rather than the formatting.  It isn’t so easy to customize, though, which limits its usefulness in a big way.  What if you need to create a new layout or take advantage of one of the thousands of specialized LaTeX styles?  How, exactly, do you go about doing that?

That’s why this article was written.  Recently, I was asked to help with a National Institutes of Health (NIH) R21 grant proposal.  After some talk amongst the different investigators, it was decided that we would use LaTeX and LyX to draft it.  Unfortunately, we hit a rather substantial hurdle early in the process:  LyX doesn’t have an NIH grant template.

After additional debate, we decided to proceed with LyX anyway.  But in the process, I found myself saddled with an additional job.  In addition to responsibilities as research flunky and copy editor, I was tasked with creating a LyX and LaTeX template for our NIH grant.  This article will summarize the steps I took and describe how to create a custom template using an available style on CTAN.


Note: All of the files in this tutorial can be downloaded here (.zip).

Show me more... »