On 4/27/06, Joel Jaeggli <joelja@darkwing.uoregon.edu> wrote:
On Fri, 28 Apr 2006, Mike McCarty wrote:
Joel Jaeggli wrote:
On Fri, 28 Apr 2006, Tim wrote:
On Wed, 2006-04-26 at 19:32 -0700, Rob wrote:
Am I right that the throughput here is 1 giga BYTE, which is 8 giga bits?
Network card speeds are listed in bits per second. These are the three most common speeds:
    10 megabits per second
    100 megabits per second
    1 gigabit per second
Don't ask me whether they're playing the SI game or the bullshit game, with mega and giga meaning millions and billions, or using 1024-based multipliers.
Bits are in fact bits in this case. When you talk about packets or frames you tend to use bytes or octets (the same thing), but line rate is bits per second.
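A quick sketch of the bits-versus-bytes arithmetic (Python here purely for illustration; the speeds are the three common ones listed above):

    # Line rates are quoted in bits per second; frame and packet sizes
    # are quoted in bytes (octets). 1 byte = 8 bits.
    def line_rate_to_bytes_per_sec(rate_bps):
        return rate_bps / 8

    for name, rate in [("10 Mb/s", 10e6), ("100 Mb/s", 100e6), ("1 Gb/s", 1e9)]:
        print(f"{name:>8} = {line_rate_to_bytes_per_sec(rate):,.0f} bytes/s")
    # 1 Gb/s works out to 125,000,000 bytes/s of raw line rate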
I think the issue raised was whether 1 Gbps is 1024*1024*1024 bps or 1000*1000*1000 bps.
1 billion bits per second is 1*10^9 bits per second; contrast that with 2^30 bits, which bc says is 1073741824.
http://en.wikipedia.org/wiki/Gigabit
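To put a number on the gap (a minimal check, echoing the bc calculation above):

    # SI gigabit (what "1 Gb/s" means for line rates) vs. the
    # 1024-based interpretation.
    si_giga = 10**9
    binary_giga = 2**30    # 1073741824, matching bc above

    print(binary_giga - si_giga)    # 73741824 bits
    print(binary_giga / si_giga)    # ~1.0737, i.e. about 7.4% larger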
Mike
The OP should look up IEEE 802.3 and the layout of the various packets/frames in the standard. Using the ping data and Ethereal, he can reconstruct each frame, count the data transmitted/received per unit time, and compute the throughput for his system.
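For anyone who wants to try that frame-accounting approach before opening a capture, here is a rough sketch using the per-frame overheads defined in IEEE 802.3; the 1 Gb/s line rate and the payload sizes are assumptions for illustration, not measurements:

    # Per-frame overheads from IEEE 802.3, in bytes (IFG in byte times).
    PREAMBLE_SFD = 8    # preamble (7) + start frame delimiter (1)
    HEADER = 14         # dest MAC (6) + src MAC (6) + EtherType (2)
    FCS = 4             # frame check sequence
    IFG = 12            # minimum inter-frame gap

    def wire_bytes(payload):
        """Total byte times one frame occupies on the wire."""
        payload = max(payload, 46)  # 802.3 pads payloads below 46 bytes
        return PREAMBLE_SFD + HEADER + payload + FCS + IFG

    def goodput(payload, line_rate_bps=1e9):
        """Payload bits actually delivered per second at a given line rate."""
        frames_per_sec = line_rate_bps / (wire_bytes(payload) * 8)
        return frames_per_sec * payload * 8

    # A default ping carries 56 data bytes + 8 ICMP + 20 IP = 84 bytes of
    # Ethernet payload; compare against full-size 1500-byte frames.
    for payload in (84, 1500):
        print(f"{payload:>5}-byte payload: {goodput(payload)/1e6:.1f} Mb/s goodput")

Small ping-sized frames land well under the nominal rate (roughly 690 Mb/s of goodput at 1 Gb/s) while full-size frames approach it, which is why per-frame accounting matters when measuring throughput this way.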