Friday 5 October 2012

Sunday Times University Guide 2013

It's finally that time of the year when the UK university league tables come out. The Sunday Times provides one of the most respected guides, ranking universities on the following criteria:
  • Student satisfaction
  • Teaching excellence
  • Heads and peer assessment
  • Research quality
  • A-Level points
  • Unemployment 
  • Firsts and 2:1s awarded
  • Dropout rate
The Sunday Times' league table came out in early October 2012. The top ten British universities are shown below.


Ranked number 3, the University of Bath has been highlighted purely out of personal pride.  haha! It's only because I got both my Bachelor of Science and my Doctor of Philosophy from there.  ^_^

This is the university's highest ever ranking in the guide, as reported here.

Since this blog is about computer science, I feel the need to announce the Sunday Times computer science league table as well. The top ten British universities for computer science are:

(I have to thank my colleague, Dr. Tom Crick, for this computer science league table.)


Wednesday 3 October 2012

The Number of Keys Needed for Secure Sharing on Online Social Networks

Now, that is the title of my paper. It was presented at the 3rd Annual International Conference on Infocomm Technologies in Competitive Strategies (ICT 2012). The conference was held in Bali, Indonesia, in the middle of September 2012.

There were many participants from around the globe, including (but not limited to) the USA, Germany, Australia, Finland, Malaysia, India and Thailand (that's me!). The papers presented there were of good quality. One of the more interesting ones was on parallel processing using GPUs, presented by Dr. Ming Ouyang, an assistant professor at the Department of Computer Engineering and Computer Science, University of Louisville, USA. What he said, basically, was that if you compare a normal CPU with a GPU these days, the GPU is better suited to parallel processing because of its architecture. I spoke to him afterwards. He said that one of his students had used a GPU to run an intrusion detection mechanism and the results were pretty impressive: it ran approximately four times faster than the same mechanism on a normal CPU. Having heard that, I began to think how interesting it would be to use a GPU to run cryptography, and even to carry out cryptanalysis. (Sorry, I forgot to say that information and network security is really my research area.)
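To give a feel for why that kind of workload maps so well onto a GPU, here is a toy sketch of the data-parallel idea (everything in it is illustrative and my own, not from the talk): split a brute-force search of a keyspace into independent chunks, one per worker. On a GPU, each chunk would map onto thousands of hardware threads instead of the handful of CPU threads used here.

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor

# Toy cryptanalysis task: recover an unknown 4-digit "key" from its hash.
target = hashlib.sha256(b"7731").hexdigest()

def search_chunk(start: int, stop: int):
    """Exhaustively try one independent slice of the keyspace."""
    for i in range(start, stop):
        cand = str(i).zfill(4).encode()
        if hashlib.sha256(cand).hexdigest() == target:
            return cand
    return None

# Partition the 0000-9999 keyspace into four independent chunks.
chunks = [(i, i + 2500) for i in range(0, 10000, 2500)]
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(lambda c: search_chunk(*c), chunks))

found = next(r for r in results if r is not None)
print(found)  # b'7731'
```

The key property is that the chunks share no state, which is exactly the shape of work a GPU's architecture rewards.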

Anyway, enough about that. Let's talk a little bit about the paper I presented, which was titled "The Number of Keys Needed for Secure Sharing on Social Networks." The reason I wrote this paper is that I realised cryptography had not been used in social networks as widely as it should be to give information better privacy. One reason was that the number of keys that needed to be held was large. I therefore had to find a way to reduce that number, and that is what the paper is about.

Below is the abstract of the paper. The full version can be read here.
  
Online social networks have become an essential communication tool these days. With popularity come security problems, especially concerning information privacy. One way to address this problem is to use cryptography. However, cryptography on online social networks has not been studied extensively; most work has been done on access control. The main issue with cryptography is the number of keys needed to encrypt and decrypt the information. The obvious approach would be to use one key for every user in our group of friends but, as we show here, that many keys is not actually necessary. This paper, therefore, attempts to show that the number of keys needed to achieve secure sharing among friends can in fact be fewer than the number of friends. We also provide proofs of correctness and security to confirm our claim.
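To make the key-counting problem concrete, here is one generic technique for holding fewer keys than friends (to be clear, this is NOT the scheme from the paper, just a standard illustration): store a single master key and derive each friend's key on demand, so the number of *stored* keys is one rather than n.

```python
import hashlib
import hmac

# Hypothetical setup: one stored master secret, four friends.
master_key = b"a-single-stored-master-secret"
friends = ["alice", "bob", "carol", "dave"]

def derive_key(master: bytes, friend_id: str) -> bytes:
    """Derive a per-friend key from the master key (simplified, HKDF-like)."""
    return hmac.new(master, friend_id.encode(), hashlib.sha256).digest()

per_friend_keys = {f: derive_key(master_key, f) for f in friends}

# Every friend still gets a distinct key...
assert len(set(per_friend_keys.values())) == len(friends)
# ...but the owner only ever stores one secret.
print("stored keys: 1, friends served:", len(friends))
```

The paper's actual construction and proofs are in the full version linked above; this sketch only shows why "one key per friend" is not a hard lower bound on storage.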

This paper was also given the Best Research Paper Award by the conference.

 

Saturday 25 August 2012

Authentication, Pre-Handoff and Handoff in Pure MANET

Yes ... this is my Ph.D. thesis title. I am not trying to brag or anything, but I just think that my thesis is worth mentioning here. Even though it was published back in 2006, I don't think it has been publicised enough.

The problem with mobile ad hoc networks at the time of writing the thesis was that there were literally no suitable security protocols for the authentication and handoff processes. I therefore felt that there was a need to design them.

Below is the abstract of the thesis:

"A mobile ad hoc network (MANET) is a relatively new networking paradigm. Mobile ad hoc networks are more exposed to security threats due to the lack of physical security. That is why the introduction of security protocols, such as authentication and secure handoff protocols, is necessary. This is especially the case for private local mobile ad hoc networks. This thesis is a progress towards producing a secure and efficient authentication protocol as well as a secure and efficient handoff protocol for private local mobile ad hoc networks.
Our investigations show that there are performance and security problems with the existing authentication and keying mechanisms. We design and develop an authentication protocol, which mitigates those problems, using a combination of well-known cryptographic tools of RSA and Diffie-Hellman. The ns2 simulations demonstrate that authentication and key establishment between two mobile nodes can be accomplished in just four messages, and in approximately ten milliseconds. The protocol has been analysed and proved secure according to the BCK and CK approaches, and the GNY logic.
The pre-handoff protocol has been designed so that mobile nodes will hold the necessary information essential for achieving fast handoffs. This stage is an extension to the normal exchange of Hello messages among mobile nodes, which occurs whenever there is a change in network topology. The multipoint relay (MPR) algorithm is also integrated into the protocol to improve the efficiency.
One feature of MANETs is the topological instability. Currently, no handoff protocols for pure MANETs are available. We, therefore, introduce an efficient and secure handoff protocol, which is suitable for the movement of any mobile node within that domain, i.e. micro-mobility. Our approach again uses the combination of RSA and Diffie-Hellman protocols. We show that a handoff can be accomplished in just five messages, and without relying upon any third parties. The ns2 simulations demonstrate that, on average, it takes less than twenty milliseconds to complete the handoff process, which is more than twice as fast as the time recommended by the ITU. The protocol has also been proved secure and correct according to the BCK and CK formal models, and the GNY analysis."
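For readers unfamiliar with one of the two building blocks mentioned in the abstract, here is a minimal Diffie-Hellman sketch, i.e. the textbook primitive, not the thesis's four-message protocol itself (the parameters below are toy values for illustration, nowhere near secure):

```python
import secrets

# Toy Diffie-Hellman parameters: a Mersenne prime modulus and small generator.
# Real deployments use standardised groups of 2048 bits or more.
p = 2**127 - 1
g = 3

a = secrets.randbelow(p - 2) + 1   # A's ephemeral secret
b = secrets.randbelow(p - 2) + 1   # B's ephemeral secret

A = pow(g, a, p)                   # public value sent A -> B
B = pow(g, b, p)                   # public value sent B -> A

shared_a = pow(B, a, p)            # A computes the shared key
shared_b = pow(A, b, p)            # B computes the shared key
assert shared_a == shared_b        # both sides derive the same secret
```

In the thesis protocol this exchange is combined with RSA so that the public values are authenticated, which is what keeps a man-in-the-middle from substituting his own exponentials.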

If interested, you can read the full version of my thesis here.

For your information, even though the thesis was done at the University of Bath, it was awarded a 2010 national thesis prize by the National Research Council of Thailand (NRCT). I received the prize from Mr. Abhisit Vejjajiva, who was the Prime Minister of Thailand at the time and is the guy on the left of the picture below.

Friday 17 August 2012

Is Andrew Secure RPC secure?

A couple of years ago, while I was preparing a lecture on Authentication Protocols for my Advanced Network and Information Security class at King Mongkut's University of Technology North Bangkok, I came across a protocol known as Andrew Secure RPC. I wondered right there and then whether the protocol was really as secure as its name suggested.

Andrew Secure RPC was first introduced in 1989 in a paper called "Integrating Security in a Large Distributed System." It is a protocol that allows two entities that already share a secret key to agree upon a new one.

I then analysed the protocol using the logic of my preferred choice, the GNY logic, and found that the protocol was not secure, as expected. I therefore looked a little deeper and found several papers that had come to the same conclusion. Those papers also made an effort to improve the protocol and make it more secure.

However, as I showed in my paper, "Some Remarks on Andrew Secure RPC", none of them was secure, not even the improved versions. What I did next was to re-design the protocol to make it more secure. The "new" protocol consists of only three messages (rather than the four in the original protocol). It also mitigates the vulnerabilities of the previous protocols, namely known-plaintext attacks and session hijacking.
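For context, the original four-message flow, as it is usually presented in the formal-analysis literature, can be sketched as below. The "encryption" here is a throwaway hash-keystream XOR purely so the sketch runs; it is not a real cipher, and the flaw annotated on message 4 is the standard observation from the literature, not my paper's full analysis.

```python
import hashlib
import os

def enc(key: bytes, msg: bytes) -> bytes:
    """Toy XOR 'encryption' for illustration only -- never use in practice."""
    stream = hashlib.sha256(key + b"stream").digest() * (len(msg) // 32 + 1)
    return bytes(m ^ s for m, s in zip(msg, stream))

dec = enc  # XOR keystream: decryption is the same operation

Kab = os.urandom(16)                       # long-term key shared by A and B
Na = int.from_bytes(os.urandom(4), "big")  # A's nonce

# 1. A -> B : A, {Na}Kab
m1 = enc(Kab, Na.to_bytes(8, "big"))
# 2. B -> A : {Na+1, Nb}Kab
Nb = int.from_bytes(os.urandom(4), "big")
m2 = enc(Kab, (Na + 1).to_bytes(8, "big") + Nb.to_bytes(8, "big"))
# 3. A -> B : {Nb+1}Kab
m3 = enc(Kab, (Nb + 1).to_bytes(8, "big"))
# 4. B -> A : {K'ab, N'b}Kab -- note that this message contains nothing
#    tied to Na or Nb, so A cannot distinguish a fresh message 4 from a
#    replay of an old one.
new_Kab, new_Nb = os.urandom(16), os.urandom(8)
m4 = enc(Kab, new_Kab + new_Nb)

recovered = dec(Kab, m4)
assert recovered[:16] == new_Kab           # A accepts the new session key
```

The predictable Na+1 / Nb+1 structure inside the ciphertexts is also what gives an attacker known-plaintext material to work with, which is one of the weaknesses the redesigned three-message protocol was built to remove.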

If you are interested, my paper was published at the 10th International Conference on Innovative Internet Community Systems in 2010.

The reason I am writing this post is that RPC, or Remote Procedure Call, is still widely referenced and used in the literature, and I just feel that it would be better to make it as secure and efficient as possible.

Tuesday 14 August 2012

Writing Terms of Reference (TOR)

OK ... This post may not be a computer science topic per se, but it's something that I have always wanted to talk about. No, no ... I am not going to write a great deal on how to write a TOR or what a TOR is. It's just going to be about something I have experienced over the years.

I have had a bit of experience helping both government and private sector organisations come up with a TOR for a system or systems that they wanted. Of course, all of them were IT-related.

For those of you who have written a TOR before, you will know that it is a tedious thing to do. Anyway, what I don't understand is the actual process of it. No matter where I go or what I am involved in, the process of writing a TOR is no different. Maybe it's just my own feeling that it should not be the way it is.

In my opinion, writing a TOR should be a process of informing people/companies/dealers what we want. That is, we should be stating the problems we are facing. I don't even think it is our job (the TOR writers) to specify what kind of system is needed to solve those problems, let alone list all the functions and features of such a system!

Writing a TOR should just be as simple as this: "Here are my problems ... What do YOU propose?"

In reality, haha ... it is far from this. Well, from what I have experienced anyway. An organisation (with money to spend, of course) will do one of two things:
  1. Know the problems. Look for a system that (they think they want and hopefully) can solve the problems. Copy and paste the functions and features of THAT system into the TOR, or
  2. Know the problems. Invite a dealer or two and ask them to list the functions and features of the solution! (er ... wouldn't they just write down the functions and features of whatever they are already selling or have in stock?)
What I'm trying to say is: why do people feel the need to waste so much time writing out, in great detail, all the functions and features of the required system (when they haven't got a clue what they are talking about)? Why don't they just state the problems and let the "professionals" propose the solutions?

You will then have several options to choose from. Of course, you will choose the solution that solves the problems and gives value for money.

Strange, but this is what really happens.

Monday 13 August 2012

Everything begins here.

You are probably wondering who I am. Well, here is a brief description of the blog owner, i.e. me. hehe!

This blog is mostly going to be on all things related to computer science.

Why computer science, you ask?? I've studied computer science all my adult life. I have been working in the field for many years, and I now feel ready to share my ideas and experiences.

Let us enjoy what I call "Computer Science: The Science You Need."