Scratching my head...I don't get it.

Discussion in 'QnA (read only)' started by DKant, Sep 26, 2004.

Thread Status:
Not open for further replies.
  1. DKant

    DKant New Member

    Joined:
    Jul 10, 2004
    Messages:
    265
    Likes Received:
    0
    Trophy Points:
    0
    Location:
    In a creaking plashtik chair
    I have 3 questions:

    1) Aren't audio CDs supposed to sound better than audio cassettes? How can that be? There will be some loss due to sampling (depending, of course, on the sampling frequency and the permissible size of each audio file), regardless of how good the reconstruction circuitry and algorithm are. On an audio tape, the magnetic signal is modulated, but it is still the original, analog signal. Shouldn't tapes (new ones, without any physical damage) therefore sound better than CDs?

    2) I understand that image and audio compression can be carried out by discarding a certain amount of reconstructible/imperceptible data, but how does compression of ordinary data (exes, text files, etc.) work, where nothing can be thrown away?

    3) Wherever I hear "encryption" I hear "prime numbers" in the same breath. I gather that prime numbers are one of the simplest (I could be terribly wrong here) and seemingly most secure means used to encrypt data, but wouldn't some kind of random encryption algorithm be more foolproof? For example, one where the type of encryption depends on the word's position in the file.

    Please clear these doubts of mine.

    I'm busy with my exams right now, so I won't be able to check this thread for a couple of days. Keep the replies coming in anyway. ;)

    Thanks to all in advance.
     
  2. AlienTech

    AlienTech New Member

    Joined:
    Sep 17, 2004
    Messages:
    433
    Likes Received:
    0
    Trophy Points:
    0
    Even with analogue signals, there is only so much you can store. You can use better media to store more, but then you run into how much you can read back. With specialized equipment you can keep pushing higher, but for regular folks there is a cut-off point: no matter what kind of tape you use, your equipment can only read so much information back. The same applies to digital, but there you can store more data, more accurately. Some people think they can hear the difference, and it is possible; old vinyl records can sound better than tapes or CDs, and even 8-track tape can sound better than regular cassettes. So it's how much bandwidth you have that determines how real it sounds. Remember, plenty of material is stored digitally on tape too (data backup, for instance), and you can go a step further and use something like quadrature modulation to store even more data on the same medium.
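
    To make "sampling loses something, but predictably little" concrete, here is a minimal Python sketch (mine, not from the thread) that samples a 1 kHz tone the way a CD does, 44,100 samples per second at 16 bits, and measures the worst quantization error:

    Code:
    # Rough sketch: CD-style sampling and quantization of a 1 kHz sine wave.
    # Uses nothing beyond the Python standard library.
    import math

    SAMPLE_RATE = 44_100                    # samples per second (CD standard)
    BIT_DEPTH = 16                          # bits per sample (CD standard)
    MAX_LEVEL = 2 ** (BIT_DEPTH - 1) - 1    # 32767 for signed 16-bit audio

    def quantize(value):
        """Map a value in [-1.0, 1.0] to the nearest 16-bit integer level."""
        return round(value * MAX_LEVEL)

    # One second of a 1 kHz tone, sampled and quantized.
    worst_error = 0.0
    for n in range(SAMPLE_RATE):
        t = n / SAMPLE_RATE
        original = math.sin(2 * math.pi * 1000 * t)
        reconstructed = quantize(original) / MAX_LEVEL
        worst_error = max(worst_error, abs(original - reconstructed))

    print(f"worst quantization error: {worst_error:.6f}")
    # about 1.5e-5, i.e. half a quantization step -- a tiny fraction of the signal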

    The simplest way to compress data would be to add a marker indicating that data previously stored is being used again.
    E.g., in the sentence above, "data" appears twice and "to" appears twice as well. I could rewrite it as:
    marker 1 = to
    marker 2 = data
    The simplest way <marker 1> compress <marker 2> would be <marker 1> add a marker indicating that <marker 2> previously stored is being used again.

    When you have many repetitions like this it starts to pay off and the file size goes down. There are many ways to do it; look up Huffman coding, for example.
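
    Here is a minimal Python sketch of that marker idea (purely illustrative -- the function names are mine, and real compressors such as LZ77/DEFLATE work on byte sequences and only emit a reference when it is actually shorter than the data it replaces):

    Code:
    # Toy dictionary substitution: replace repeated words with short markers.
    def compress(text):
        words = text.split()
        table = {}                      # word -> marker number
        out = []
        for w in words:
            if words.count(w) > 1:      # only bother with repeated words
                if w not in table:
                    table[w] = len(table) + 1
                out.append(f"<{table[w]}>")
            else:
                out.append(w)
        return table, " ".join(out)

    def decompress(table, packed):
        reverse = {f"<{n}>": w for w, n in table.items()}
        return " ".join(reverse.get(tok, tok) for tok in packed.split())

    sentence = ("to compress data the simplest way would be to add a marker "
                "to indicate that data previously stored are used again")
    table, packed = compress(sentence)
    print(table)    # {'to': 1, 'data': 2}
    print(packed)   # <1> compress <2> the simplest way would be <1> add a marker ...
    assert decompress(table, packed) == sentence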

    Encryption is a close cousin of this kind of encoding, but the goal is different: the point is not to reduce the data size, it is to protect the original data and make it harder to decode. Look at your keyboard: the keys don't go ABCDEFGHI, so the letters are already re-mapped, but because we all know the same mapping we can still communicate with each other. Now imagine we suddenly changed where the keys map to, say ASDFGHJ instead of ABCDEFGHI. We would press the same keys, but the printout would be different, and only someone who knows the new mapping could read it. Using prime numbers is more complex (I'd need to look that up), but roughly, the scheme's strength is a known quantity, and only someone with enough computing power could hope to decode the messages. In other words, it is a way of making sure that anyone without the key has a much harder time trying to break your code.
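
    The keyboard re-mapping described above is essentially a substitution cipher. A minimal Python sketch (illustrative only -- a real cipher is nothing this weak, since a fixed letter-for-letter mapping falls to simple frequency analysis):

    Code:
    # Toy substitution cipher: re-map each letter of the alphabet to another letter.
    import string

    PLAIN  = string.ascii_lowercase
    CIPHER = "qwertyuiopasdfghjklzxcvbnm"   # an arbitrary secret re-mapping

    ENC = str.maketrans(PLAIN, CIPHER)
    DEC = str.maketrans(CIPHER, PLAIN)

    message = "meet me at noon"
    scrambled = message.translate(ENC)
    print(scrambled)                        # dttz dt qz fggf
    assert scrambled.translate(DEC) == message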
     
  3. GNUrag

    GNUrag FooBar Guy

    Joined:
    Jun 22, 2004
    Messages:
    1,246
    Likes Received:
    5
    Trophy Points:
    0
    Location:
    Interwebs
    You hear about prime numbers every time because there is no simple, predetermined pattern to how primes occur. In schemes like RSA, the key is built from two very large primes, and recovering them means factoring their product; random guessing is practically impossible once the cipher strength is high enough, something like 1024 bits.
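
    For what it's worth, the usual "prime number encryption" people mean is RSA. A deliberately tiny sketch in Python (the textbook example numbers; real keys use primes hundreds of digits long, and this is only meant to show where the primes fit in):

    Code:
    # Toy RSA with tiny primes (Python 3.8+ for pow(e, -1, phi)).
    p, q = 61, 53                  # the two secret primes
    n = p * q                      # 3233, the public modulus
    phi = (p - 1) * (q - 1)        # 3120
    e = 17                         # public exponent, coprime to phi
    d = pow(e, -1, phi)            # 2753, private exponent (modular inverse of e)

    message = 65                   # a number standing in for some data
    cipher = pow(message, e, n)    # encrypt with the public key (e, n)
    plain  = pow(cipher, d, n)     # decrypt with the private key (d, n)

    print(cipher, plain)           # 2790 65
    assert plain == message
    # Breaking the key means recovering p and q from n alone, i.e. factoring
    # 3233 back into 61 * 53 -- easy here, infeasible for huge primes.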

    However, prime numbers are not the only "pattern-free" numbers around. Another algorithm can be built on the value of pi: as you know, its decimal expansion (3.1415926...) shows no obvious pattern. Blowfish, a popular cipher, for example uses the digits of pi to initialize its internal tables. Read this paper on Blowfish: < http://www.schneier.com/blowfish.html >. Blowfish support is now natively present in the Linux kernel as well.

    Nope, it's not just another compression method: compression follows a predictable, reversible pattern that anyone can undo, whereas encryption is designed so that only someone holding the key can.
     
  4. sujithtom

    sujithtom New Member

    Joined:
    Aug 14, 2004
    Messages:
    512
    Likes Received:
    1
    Trophy Points:
    0
    Location:
    Not anywhere near you
    I will explain it in simple words :wink:
    In data files, especially text files, the same patterns of bytes or characters repeat again and again. We can replace each repeated pattern with a single byte or character. E.g., look at the data below:

    AAAA BBBB CCCC ABC ABC ABC ABC CCCC BBBB BBBB BBBB BBBB AAAA AAAA

    Suppose we substitute for these patterns as shown below:
    AAAA=1, BBBB=2, CCCC=3, ABC=4
    Then the data becomes:
    1 2 3 4 4 4 4 3 2 2 2 2 1 1

    Actual compression isn't done exactly like this, but I think the concept is the same (a rough code sketch follows below).
    Check www.howstuffworks.com for more reference. 8)
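
    A minimal Python sketch of that substitution idea (the table is fixed by hand here; real compressors discover the repeated patterns themselves and make sure the codes are shorter than what they replace):

    Code:
    # Hand-built substitution table for the example above.
    TABLE = {"AAAA": "1", "BBBB": "2", "CCCC": "3", "ABC": "4"}

    def compress(text):
        return " ".join(TABLE.get(chunk, chunk) for chunk in text.split())

    def decompress(packed):
        reverse = {code: chunk for chunk, code in TABLE.items()}
        return " ".join(reverse.get(tok, tok) for tok in packed.split())

    data = "AAAA BBBB CCCC ABC ABC ABC ABC CCCC BBBB BBBB BBBB BBBB AAAA AAAA"
    packed = compress(data)
    print(packed)                  # 1 2 3 4 4 4 4 3 2 2 2 2 1 1
    assert decompress(packed) == data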
     
  5. theraven

    theraven Active Member

    Joined:
    May 5, 2004
    Messages:
    2,912
    Likes Received:
    0
    Trophy Points:
    36
    Location:
    off to "never ever" land
    Compression is done by algorithms.
    Now, when you talk about image compression and rejecting unwanted/repeated/reconstructible data, you aren't really talking about compression as such; you're talking about image processing and how you can make the file smaller.
    Compression proper comes in when you save the file, say as a JPEG: converting it to JPEG automatically compresses it with an algorithm MEANT for JPEG, like run-length coding (RLC), which is standard in JPEGs.
    That's why you see such a difference in file size between, say, BMPs and JPEGs, without an apparent loss in visual appeal.
    Other such algorithms are Huffman coding, and older ones still, like the aztech and fan algorithms.
    Algorithms like these are used to compress other kinds of files too; THAT is what's known as compression.
    They of course work in different ways, each as per its definition.
    Let me give a few examples.
    Phew... need a break, getting tired... OK, back to it.
    We all know everything is represented in 0s and 1s, i.e. binary code. Everything has a binary code; most commonly, for numbers, it goes like this:
    0 => 00
    1 => 01
    2 => 10
    3 => 11
    This is of course just a simple 2-bit code, so it has 2 raised to 2 (2 raised to n is the general formula) = 4 possibilities, covering 0-3.
    Now suppose we have a sequence like 0000000111111222223333444556, each digit being a separate character to be stored (I didn't want to type the commas). We can see that storing this as-is is quite a waste.
    So in run-length style algorithms such as aztech (I know I changed the example, just follow me here, I'll get back to it), we store the value and the number of times it is repeated, as sketched just below. (It's a little more complex than this once slopes and the like come into play, but this is the simple explanation.)
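
    A minimal Python sketch of that value-plus-count trick (plain run-length encoding, not any particular codec):

    Code:
    # Run-length encoding: store each value once, together with its repeat count.
    def rle_encode(s):
        out = []
        i = 0
        while i < len(s):
            j = i
            while j < len(s) and s[j] == s[i]:
                j += 1
            out.append((s[i], j - i))       # (value, number of repeats)
            i = j
        return out

    def rle_decode(pairs):
        return "".join(value * count for value, count in pairs)

    seq = "0000000111111222223333444556"
    packed = rle_encode(seq)
    print(packed)   # [('0', 7), ('1', 6), ('2', 5), ('3', 4), ('4', 3), ('5', 2), ('6', 1)]
    assert rle_decode(packed) == seq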
    Now, what we do in Huffman coding:
    we start from the binary code of each symbol, since that is how the computer stores them anyway, and then apply an algorithm that derives a modified code (the Huffman code) from it, such that the most frequently occurring symbol gets a short code and the least frequent ones get the longer codes.
    That saves space and leads to... you guessed it => COMPRESSION !!
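
    And a compact Python sketch of Huffman coding itself (the standard textbook construction, applied to the same digit sequence; variable names are mine):

    Code:
    # Huffman coding: frequent symbols get short codes, rare symbols long ones.
    import heapq
    from collections import Counter

    def huffman_codes(text):
        # Heap entries: (frequency, tie_breaker, {symbol: code_so_far})
        heap = [(freq, i, {sym: ""})
                for i, (sym, freq) in enumerate(Counter(text).items())]
        heapq.heapify(heap)
        tie = len(heap)
        while len(heap) > 1:
            f1, _, left = heapq.heappop(heap)
            f2, _, right = heapq.heappop(heap)
            merged = {s: "0" + c for s, c in left.items()}
            merged.update({s: "1" + c for s, c in right.items()})
            heapq.heappush(heap, (f1 + f2, tie, merged))
            tie += 1
        return heap[0][2]                   # {symbol: its Huffman code}

    seq = "0000000111111222223333444556"
    codes = huffman_codes(seq)
    encoded = "".join(codes[ch] for ch in seq)
    print(codes)
    print(f"{len(seq) * 8} bits as plain bytes -> {len(encoded)} bits Huffman-coded")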
     
  6. DKant (OP)

    DKant New Member

    Joined:
    Jul 10, 2004
    Messages:
    265
    Likes Received:
    0
    Trophy Points:
    0
    Location:
    In a creaking plashtik chair
    Alright, I get the compression part. In fact I tried something similar (i.e. tagging repeated data) for my 12th-standard project... I ended up "compressing" a 20 KB file to 1 MB!!!! :D Will have to spend more time on Google... or Clusty :p

    Moving on to encryption: OK, so how is the data decrypted? Does the receiving node just run through all the possible prime numbers and then say "BINGO!!"?
     
  7. go4inet

    go4inet New Member

    Joined:
    Feb 18, 2004
    Messages:
    300
    Likes Received:
    0
    Trophy Points:
    0
    Location:
    Chennai
    Stop scratching and act wise... that's the way you learn.
     
  8. DKant (OP)

    DKant New Member

    Joined:
    Jul 10, 2004
    Messages:
    265
    Likes Received:
    0
    Trophy Points:
    0
    Location:
    In a creaking plashtik chair
    lol. It starts with scratching :razz: Friction can create fire...remember? :)
     
  9. Deep

    Deep Version 2.0

    Joined:
    Jan 23, 2004
    Messages:
    977
    Likes Received:
    0
    Trophy Points:
    0
    Location:
    Mumbai
    Ignore him...
    he is just posting to increase his post count lol
     