In Java does anyone use short or byte?









28 votes, 5 favorites












Apart from using (byte) in streaming I don't really see byte and short used much. On the other hand I have seen long used where the actual value is |100| and byte would be more appropriate. Is this a consequence of the relative inexpensive nature of memory now or is this just minutia that developers needn't worry about?





































      java






      edited Nov 9 at 18:16 by Stephen Kennedy
      asked Oct 8 '09 at 19:00 by non sequitor






















          10 Answers

















          33 votes (accepted)










          They are used when programming for embedded devices that are short on memory or disk space, such as appliances and other electronic devices.

          byte is also used in low-level web programming, where you send raw requests to web servers as headers, etc.






          answered Oct 8 '09 at 19:04 by Shawn Mclean (edited Aug 12 '10 at 18:32)






















          • That's why I don't see them; I'm never looking at appliance or electronics source code, cheers. Come to think of it, that was the original intention of Java before it made waves in applets and then took off.
            – non sequitor
            Oct 8 '09 at 19:54










          • But then again, isn't Java too slow and too big for appliances? Isn't C/C++ the norm there?
            – J. K.
            Dec 23 '15 at 16:58










          • It is true to say that Java is used less in embedded devices, but still quite a bit. Java ME is quite small and ARM processors have a mode that can execute JVM bytecode directly. Garbage collection is not predictable though, so most real time applications would use lower level programming.
            – mjaggard
            Nov 22 '16 at 7:55










          • "devices that are short on memory" — No pun intended.
            – MC Emperor
            Nov 12 at 10:57


















          21 votes













          The byte datatype is frequently used when dealing with raw data from a file or network connection, though it is mostly used as byte arrays. The short type is often used in connection with GUIs and image processing (for pixel locations and image sizes), and in sound processing.



          The primary reason for using byte or short is clarity. The program code states categorically that only 8 or 16 bits are to be used, and when you accidentally use a larger type (without an appropriate typecast) you get a compilation error. (Admittedly, this could also be viewed as a nuisance when writing the code ... but then again, the presence of the typecasts flags to the reader that truncation is happening.)



          You don't achieve any space saving by using byte or short instead of int for simple variables, because most Java implementations align stack variables and object members on word boundaries. However, primitive array types are handled differently; i.e. elements of boolean, byte, char and short arrays are byte-aligned. But unless the arrays are large in size or large in number, they don't make any significant contribution to the app's overall memory usage.
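To put rough numbers on the array point, here is a minimal sketch (the element sizes come from the language spec; the byte counts ignore the small constant object-header overhead):

```java
public class ArrayFootprint {
    public static void main(String[] args) {
        int n = 1_000_000;
        // Element storage only; the array object header adds a small constant.
        long asByteArray = (long) n * Byte.BYTES;    // 1,000,000 bytes (~1 MB)
        long asIntArray  = (long) n * Integer.BYTES; // 4,000,000 bytes (~4 MB)
        System.out.println("byte[" + n + "] elements: " + asByteArray + " bytes");
        System.out.println("int[" + n + "]  elements: " + asIntArray + " bytes");
        // A single byte field or local variable, by contrast, is typically
        // padded out to a word boundary, so it saves nothing over an int.
    }
}
```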



          So I guess that the main reason that developers don't use byte or short as much as you (a C developer?) might expect is that it really doesn't make much (or often any) difference. Java developers tend not to obsess over memory usage like old-school C developers did :-).






          answered Oct 9 '09 at 2:03 by Stephen C (edited Jun 21 '13 at 4:51)






















          • If you tried to obsess over memory usage, I think Java would drive you mad. However, as a Java Dev myself, it's nice to know that I'm 'doing my bit' so I always choose the most succinct type where possible. It's not only about memory usage, it's about clarity.
            – Chris Hatton
            Dec 31 '14 at 3:28






          • 3




            @ChrisHatton - Well "your bit" is probably wasted effort (or harmful) if you are concerned about memory usage and performance. Seriously, undirected micro-optimization is usually wasted effort. And I already mentioned the clarity issue.
            – Stephen C
            Dec 31 '14 at 3:33


















          13 votes













          On a 64-bit processor the registers are all 64-bit, so if your local variable is assigned to a register, it makes no difference whether it is a boolean, byte, short, char, int, float, double or long: it doesn't use memory and doesn't save any resources.

          Objects are 8-byte aligned, so they always take up a multiple of 8 bytes in memory. This means Boolean, Byte, Short, Character, Integer, Long, Float and Double, as well as AtomicBoolean, AtomicInteger, AtomicLong and AtomicReference, all use the same amount of memory.



          As has been noted, short types are used for arrays and reading/writing data formats. Even then short is not used very often IMHO.



          It's also worth noting that a GB costs about £80 in a server, so a MB is about 8 pence and a KB is about 0.008 pence. The difference between a byte and a long is about 0.00006 pence. Your time is worth more than that, especially if you ever have a bug which results from having a data type that was too small.




























          • Very good reminder about the boundaries. The calculation was also fun, but I'm pretty sure int or long weren't made just for avoiding bugs. It's more of a habit thing.
            – Charles Roberto Canato
            Apr 30 '14 at 22:56

















          5 votes













          I would most often use the short and byte types when working with binary formats and DataInput/DataOutput instances. If the spec says the next value is an 8-bit or 16-bit value, and there's no value in promoting them to int (perhaps they're bit flags), they are an obvious choice.
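As a sketch of that use case, here is a round trip through DataOutputStream/DataInputStream; the two-field record layout (a 16-bit size, then an 8-bit flags byte) is invented for illustration:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class BinaryRecord {
    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (DataOutputStream out = new DataOutputStream(buf)) {
            out.writeShort(1024);  // a 16-bit field from the (hypothetical) spec
            out.writeByte(0x2F);   // an 8-bit flags field
        }
        try (DataInputStream in = new DataInputStream(
                new ByteArrayInputStream(buf.toByteArray()))) {
            short size = in.readShort();  // read back at the spec's exact width
            byte flags = in.readByte();
            System.out.println(size + " " + flags);  // 1024 47
        }
    }
}
```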

































            3 votes













            Arithmetic on bytes and shorts is more awkward than with ints. For example, if b1 and b2 are two byte variables, you can't write byte b3 = b1 + b2 to add them. This is because Java never does arithmetic internally in anything smaller than an int, so the expression b1 + b2 has type int even though it is only adding two byte values. You'd have to write byte b3 = (byte) (b1 + b2) instead.
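A minimal illustration of the promotion rule described above:

```java
public class BytePromotion {
    public static void main(String[] args) {
        byte b1 = 60, b2 = 70;
        // byte b3 = b1 + b2;        // does not compile: b1 + b2 has type int
        byte b3 = (byte) (b1 + b2);  // explicit cast required; 130 wraps to -126
        int sum = b1 + b2;           // widening to int needs no cast and no wrap
        System.out.println(b3 + " " + sum);  // -126 130
    }
}
```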


























            • I'm just learning java and this is something I initially struggled with. I couldn't understand why my book always used the int type. For example it didn't make sense to me to use the int type for number of days in a month. So I would change it in my code and would later run into issues.
              – Jeffpowrs
              Aug 17 '13 at 16:45

















            2 votes













            I used short extensively when creating an emulator based on a 16-bit architecture. I considered using char so I could have stuff unsigned but the spirit of using a real integer type won out in the end.



            Edit: regarding the inevitable question about what I did when I needed the most significant bit: with the thing I was emulating, it happened to almost never get used. In the few places it was used, I just used bitwise operators or math hackery.
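When the most significant bit does matter, the usual workarounds look like this (a generic sketch, not the emulator's actual code):

```java
public class UnsignedShortTricks {
    public static void main(String[] args) {
        short word = (short) 0xFFFF;   // all 16 bits set; as a signed short this is -1
        int unsigned = word & 0xFFFF;  // widen to int, keep only the low 16 bits
        char asChar = (char) word;     // char is Java's only unsigned 16-bit type
        System.out.println(unsigned + " " + (int) asChar);  // 65535 65535
    }
}
```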



































              1 vote













              I think in most applications short has no domain meaning, so it makes more sense to use Integer.

































                1 vote













                short and others are often used for storing image data. Note that it is the number of bits which is really important, not the arithmetic properties (which just cause promotion to int or better).



                short is also used as array indexes in JavaCard (1.0 and 2.0, IIRC, but not 3.0 which also has an HTTP stack and web services).

































                  1 vote













                  byte happens all the time; buffers, specifically for networks, files, graphics, serialization, etc.
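A typical buffer-oriented sketch using java.nio (the length-prefixed framing here is an invented example, and the file/network plumbing is omitted):

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

public class BufferDemo {
    public static void main(String[] args) {
        // Serialize a length-prefixed payload into raw bytes, as you might
        // before writing it to a socket or file channel.
        byte[] payload = "hello".getBytes(StandardCharsets.UTF_8);
        ByteBuffer buf = ByteBuffer.allocate(4 + payload.length);
        buf.putInt(payload.length);  // 4-byte length prefix
        buf.put(payload);
        buf.flip();                  // switch from writing to reading

        int len = buf.getInt();
        byte[] out = new byte[len];
        buf.get(out);
        System.out.println(new String(out, StandardCharsets.UTF_8));  // hello
    }
}
```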

































                    0 votes













                    Most of the time, there's no really good technical reason for a developer (Java, C#, BASIC, etc.) to choose between an int, short or byte, as long as the capacity is enough, of course. If the value will be under 2 billion, then int it will be.



                    Are you sure we'll have people older than 255? Well, you never know!



                    Aren't 32,767 possible countries enough? Don't think too small!



                    In your example, you can be perfectly happy with your byte var containing 100, if you are absolutely sure that it will NEVER overflow. Why do guys use int the most? Because... because.



                    This is one of those things that most of us just do because we saw it that way most of the time, and never asked differently.



                    Of course, I have nothing against "all things int". I just prefer to use the right type for each kind of value, no stress involved.



























                      10 Answers
                      10






                      active

                      oldest

                      votes








                      10 Answers
                      10






                      active

                      oldest

                      votes









                      active

                      oldest

                      votes






                      active

                      oldest

                      votes








                      up vote
                      33
                      down vote



                      accepted










                      They are used when programming for embedded devices that are short on memory or disk space. Such as appliances and other electronic devices.



                      Byte is also used in low level web programming, where you send requests to web servers using headers, etc.






                      share|improve this answer






















                      • That's why I don't see them, I'm never looking at appliance or electronics source code, cheers -- coming to think of it, that was the original intention of Java before it made waves in applets and then took off.
                        – non sequitor
                        Oct 8 '09 at 19:54










                      • but then again, isnt java too slow and too big for appliances? isnt c/c++ the norm there?
                        – J. K.
                        Dec 23 '15 at 16:58










                      • It is true to say that Java is used less in embedded devices, but still quite a bit. Java ME is quite small and ARM processors have a mode that can execute JVM bytecode directly. Garbage collection is not predictable though, so most real time applications would use lower level programming.
                        – mjaggard
                        Nov 22 '16 at 7:55










                      • "devices that are short on memory" — No pun intended.
                        – MC Emperor
                        Nov 12 at 10:57















                      up vote
                      33
                      down vote



                      accepted










                      They are used when programming for embedded devices that are short on memory or disk space. Such as appliances and other electronic devices.



                      Byte is also used in low level web programming, where you send requests to web servers using headers, etc.






                      share|improve this answer






















                      • That's why I don't see them, I'm never looking at appliance or electronics source code, cheers -- coming to think of it, that was the original intention of Java before it made waves in applets and then took off.
                        – non sequitor
                        Oct 8 '09 at 19:54










                      • but then again, isnt java too slow and too big for appliances? isnt c/c++ the norm there?
                        – J. K.
                        Dec 23 '15 at 16:58










                      • It is true to say that Java is used less in embedded devices, but still quite a bit. Java ME is quite small and ARM processors have a mode that can execute JVM bytecode directly. Garbage collection is not predictable though, so most real time applications would use lower level programming.
                        – mjaggard
                        Nov 22 '16 at 7:55










                      • "devices that are short on memory" — No pun intended.
                        – MC Emperor
                        Nov 12 at 10:57













                      up vote
                      33
                      down vote



                      accepted







                      up vote
                      33
                      down vote



                      accepted






                      They are used when programming for embedded devices that are short on memory or disk space. Such as appliances and other electronic devices.



                      Byte is also used in low level web programming, where you send requests to web servers using headers, etc.






                      share|improve this answer














                      They are used when programming for embedded devices that are short on memory or disk space. Such as appliances and other electronic devices.



                      Byte is also used in low level web programming, where you send requests to web servers using headers, etc.







                      share|improve this answer














                      share|improve this answer



                      share|improve this answer








                      edited Aug 12 '10 at 18:32

























                      answered Oct 8 '09 at 19:04









                      Shawn Mclean

                      28.4k79251383




                      28.4k79251383











                      • That's why I don't see them, I'm never looking at appliance or electronics source code, cheers -- coming to think of it, that was the original intention of Java before it made waves in applets and then took off.
                        – non sequitor
                        Oct 8 '09 at 19:54










                      • but then again, isnt java too slow and too big for appliances? isnt c/c++ the norm there?
                        – J. K.
                        Dec 23 '15 at 16:58










                      • It is true to say that Java is used less in embedded devices, but still quite a bit. Java ME is quite small and ARM processors have a mode that can execute JVM bytecode directly. Garbage collection is not predictable though, so most real time applications would use lower level programming.
                        – mjaggard
                        Nov 22 '16 at 7:55










                      • "devices that are short on memory" — No pun intended.
                        – MC Emperor
                        Nov 12 at 10:57

















                      • That's why I don't see them, I'm never looking at appliance or electronics source code, cheers -- coming to think of it, that was the original intention of Java before it made waves in applets and then took off.
                        – non sequitor
                        Oct 8 '09 at 19:54










                      • but then again, isnt java too slow and too big for appliances? isnt c/c++ the norm there?
                        – J. K.
                        Dec 23 '15 at 16:58










                      • It is true to say that Java is used less in embedded devices, but still quite a bit. Java ME is quite small and ARM processors have a mode that can execute JVM bytecode directly. Garbage collection is not predictable though, so most real time applications would use lower level programming.
                        – mjaggard
                        Nov 22 '16 at 7:55










                      • "devices that are short on memory" — No pun intended.
                        – MC Emperor
                        Nov 12 at 10:57
















                      That's why I don't see them, I'm never looking at appliance or electronics source code, cheers -- coming to think of it, that was the original intention of Java before it made waves in applets and then took off.
                      – non sequitor
                      Oct 8 '09 at 19:54




                      That's why I don't see them, I'm never looking at appliance or electronics source code, cheers -- coming to think of it, that was the original intention of Java before it made waves in applets and then took off.
                      – non sequitor
                      Oct 8 '09 at 19:54












                      but then again, isnt java too slow and too big for appliances? isnt c/c++ the norm there?
                      – J. K.
                      Dec 23 '15 at 16:58




                      but then again, isnt java too slow and too big for appliances? isnt c/c++ the norm there?
                      – J. K.
                      Dec 23 '15 at 16:58












                      It is true to say that Java is used less in embedded devices, but still quite a bit. Java ME is quite small and ARM processors have a mode that can execute JVM bytecode directly. Garbage collection is not predictable though, so most real time applications would use lower level programming.
                      – mjaggard
                      Nov 22 '16 at 7:55




                      It is true to say that Java is used less in embedded devices, but still quite a bit. Java ME is quite small and ARM processors have a mode that can execute JVM bytecode directly. Garbage collection is not predictable though, so most real time applications would use lower level programming.
                      – mjaggard
                      Nov 22 '16 at 7:55












                      "devices that are short on memory" — No pun intended.
                      – MC Emperor
                      Nov 12 at 10:57





                      "devices that are short on memory" — No pun intended.
                      – MC Emperor
                      Nov 12 at 10:57













                      up vote
                      21
                      down vote













                      The byte datatype is frequently used when dealing with raw data from a file or network connection, though it is mostly used as byte. The short and short types are often used in connection with GUIs and image processing (for pixel locations & image sizes), and in sound processing.



                      The primary reason for using byte or short is one of clarity. The program code states uncategorically that only 8 or 16 bits are to be used, and when you accidentally use a larger type (without an appropriate typecast) you get a compilation error. (Admittedly, this could also be viewed as a nuisance when writing the code ... but once again the presence of the typecasts flags the fact that there is truncation happening to the reader.)



                      You don't achieve any space saving by using byte or short in simple variables instead of int, because most Java implementations align stack variables and object members on word boundaries. However, primitive array types are handled differently; i.e. elements of boolean, byte, char and short arrays are byte aligned. But unless the arrays are large in size or large in number, they doesn't make any significant contribution to the app's overall memory usage.



                      So I guess that the main reason that developers don't use byte or short as much as you (a C developer?) might expect is that it really doesn't make much (or often any) difference. Java developers tend not to obsess over memory usage like old-school C developers did :-).






                      share|improve this answer






















                      • If you tried to obsess over memory usage, I think Java would drive you mad. However, as a Java Dev myself, it's nice to know that I'm 'doing my bit' so I always choose the most succinct type where possible. It's not only about memory usage, it's about clarity.
                        – Chris Hatton
                        Dec 31 '14 at 3:28






                      • 3




                        @ChrisHatton - Well "your bit" is probably wasted effort (or harmful) if you are concerned about memory usage and performance. Seriously, undirected micro-optimization is usually wasted effort. And I already mentioned the clarity issue.
                        – Stephen C
                        Dec 31 '14 at 3:33















                      up vote
                      21
                      down vote













                      The byte datatype is frequently used when dealing with raw data from a file or network connection, though it is mostly used as byte. The short and short types are often used in connection with GUIs and image processing (for pixel locations & image sizes), and in sound processing.



                      The primary reason for using byte or short is one of clarity. The program code states uncategorically that only 8 or 16 bits are to be used, and when you accidentally use a larger type (without an appropriate typecast) you get a compilation error. (Admittedly, this could also be viewed as a nuisance when writing the code ... but once again the presence of the typecasts flags the fact that there is truncation happening to the reader.)



                      You don't achieve any space saving by using byte or short in simple variables instead of int, because most Java implementations align stack variables and object members on word boundaries. However, primitive array types are handled differently; i.e. elements of boolean, byte, char and short arrays are byte aligned. But unless the arrays are large in size or large in number, they doesn't make any significant contribution to the app's overall memory usage.



                      So I guess that the main reason that developers don't use byte or short as much as you (a C developer?) might expect is that it really doesn't make much (or often any) difference. Java developers tend not to obsess over memory usage like old-school C developers did :-).






                      share|improve this answer






















                      • If you tried to obsess over memory usage, I think Java would drive you mad. However, as a Java Dev myself, it's nice to know that I'm 'doing my bit' so I always choose the most succinct type where possible. It's not only about memory usage, it's about clarity.
                        – Chris Hatton
                        Dec 31 '14 at 3:28






                      • 3




                        @ChrisHatton - Well "your bit" is probably wasted effort (or harmful) if you are concerned about memory usage and performance. Seriously, undirected micro-optimization is usually wasted effort. And I already mentioned the clarity issue.
                        – Stephen C
                        Dec 31 '14 at 3:33













                      up vote
                      21
                      down vote










                      up vote
                      21
                      down vote









                      The byte datatype is frequently used when dealing with raw data from a file or network connection, though it is mostly used as byte. The short and short types are often used in connection with GUIs and image processing (for pixel locations & image sizes), and in sound processing.



                      The primary reason for using byte or short is one of clarity. The program code states uncategorically that only 8 or 16 bits are to be used, and when you accidentally use a larger type (without an appropriate typecast) you get a compilation error. (Admittedly, this could also be viewed as a nuisance when writing the code ... but once again the presence of the typecasts flags the fact that there is truncation happening to the reader.)



                      You don't achieve any space saving by using byte or short in simple variables instead of int, because most Java implementations align stack variables and object members on word boundaries. However, primitive array types are handled differently; i.e. elements of boolean, byte, char and short arrays are byte aligned. But unless the arrays are large in size or large in number, they doesn't make any significant contribution to the app's overall memory usage.



                      So I guess that the main reason that developers don't use byte or short as much as you (a C developer?) might expect is that it really doesn't make much (or often any) difference. Java developers tend not to obsess over memory usage like old-school C developers did :-).






                      share|improve this answer














                      The byte datatype is frequently used when dealing with raw data from a file or network connection, though it is mostly used as byte. The short and short types are often used in connection with GUIs and image processing (for pixel locations & image sizes), and in sound processing.



                      The primary reason for using byte or short is one of clarity. The program code states uncategorically that only 8 or 16 bits are to be used, and when you accidentally use a larger type (without an appropriate typecast) you get a compilation error. (Admittedly, this could also be viewed as a nuisance when writing the code ... but once again the presence of the typecasts flags the fact that there is truncation happening to the reader.)



                      You don't achieve any space saving by using byte or short in simple variables instead of int, because most Java implementations align stack variables and object members on word boundaries. However, primitive array types are handled differently; i.e. elements of boolean, byte, char and short arrays are byte aligned. But unless the arrays are large in size or large in number, they doesn't make any significant contribution to the app's overall memory usage.



                      So I guess that the main reason that developers don't use byte or short as much as you (a C developer?) might expect is that it really doesn't make much (or often any) difference. Java developers tend not to obsess over memory usage like old-school C developers did :-).







                      edited Jun 21 '13 at 4:51

























                      answered Oct 9 '09 at 2:03









                      Stephen C

                      • If you tried to obsess over memory usage, I think Java would drive you mad. However, as a Java Dev myself, it's nice to know that I'm 'doing my bit' so I always choose the most succinct type where possible. It's not only about memory usage, it's about clarity.
                        – Chris Hatton
                        Dec 31 '14 at 3:28






                      • 3




                        @ChrisHatton - Well "your bit" is probably wasted effort (or harmful) if you are concerned about memory usage and performance. Seriously, undirected micro-optimization is usually wasted effort. And I already mentioned the clarity issue.
                        – Stephen C
                        Dec 31 '14 at 3:33




























                      up vote
                      13
                      down vote













                      On a 64-bit processor the registers are all 64-bit, so if your local variable is assigned to a register it doesn't use memory at all, whether it is a boolean, byte, short, char, int, float, double or long; using a smaller type doesn't save any resources.
                      Objects are 8-byte aligned, so they always take up a multiple of 8 bytes in memory. This means Boolean, Byte, Short, Character, Integer, Long, Float and Double, as well as AtomicBoolean, AtomicInteger, AtomicLong and AtomicReference, all use the same amount of memory.



                      As has been noted, the short type is used for arrays and for reading/writing data formats. Even then, short is not used very often, IMHO.



                      It's also worth noting that a GB costs about £80 in a server, so a MB is about 8 pence and a KB is about 0.008 pence. The difference between a byte and a long is about 0.00006 pence. Your time is worth more than that, especially if you ever have a bug that resulted from a data type that was too small.






                      • Very good reminder about the boundaries. The calculation was also fun, but I'm pretty sure int or long weren't made just for avoiding bugs. It's more of a habit thing.
                        – Charles Roberto Canato
                        Apr 30 '14 at 22:56














                      edited Aug 12 '10 at 19:41

























                      answered Oct 10 '09 at 9:34









                      Peter Lawrey

                      up vote
                      5
                      down vote













                      I would most often use the short and byte types when working with binary formats and DataInput/DataOutput instances. If the spec says the next value is an 8-bit or 16-bit value and there's no value in promoting them to int (perhaps they're bit flags), they are an obvious choice.
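For instance (a sketch with an invented record layout, not a real format), reading a record whose spec defines an 8-bit flag followed by a 16-bit length:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class BinaryRecord {
    public static void main(String[] args) throws IOException {
        // Write the record: an 8-bit flag followed by a 16-bit length (big-endian).
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(buf);
        out.writeByte(0x2A);
        out.writeShort(512);

        // Read it back with types that match the spec exactly.
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(buf.toByteArray()));
        byte flag = in.readByte();      // 8 bits, as the spec says
        short length = in.readShort();  // 16 bits, as the spec says
        System.out.println(flag + " " + length);  // prints "42 512"
    }
}
```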






                          answered Oct 8 '09 at 19:26









                          McDowell

                              up vote
                              3
                              down vote













                              Arithmetic on bytes and shorts is more awkward than with ints. For example, if b1 and b2 are two byte variables, you can't write byte b3 = b1 + b2 to add them. This is because Java never does arithmetic internally in anything smaller than an int, so the expression b1 + b2 has type int even though it is only adding two byte values. You'd have to write byte b3 = (byte) (b1 + b2) instead.
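A short sketch of the point (variable names invented):

```java
public class BytePromotion {
    public static void main(String[] args) {
        byte b1 = 100, b2 = 27;
        // byte b3 = b1 + b2;        // does not compile: b1 + b2 is an int expression
        byte b3 = (byte) (b1 + b2);  // the narrowing cast is mandatory
        System.out.println(b3);      // prints 127

        byte b4 = (byte) (b1 + b1);  // note the cast also wraps silently on overflow:
        System.out.println(b4);      // 200 wraps to -56
    }
}
```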






                              • I'm just learning java and this is something I initially struggled with. I couldn't understand why my book always used the int type. For example it didn't make sense to me to use the int type for number of days in a month. So I would change it in my code and would later run into issues.
                                – Jeffpowrs
                                Aug 17 '13 at 16:45














                              answered Aug 12 '10 at 19:26









                              Luke Woodward

                              up vote
                              2
                              down vote













                              I used short extensively when creating an emulator based on a 16-bit architecture. I considered using char so I could have unsigned values, but the spirit of using a real integer type won out in the end.



                              edit: regarding the inevitable question about what I did when I needed the most significant bit: with the thing I was emulating, it happened to almost never get used. In the few places where it was, I just used bitwise operators or math hackery.
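By way of a sketch (not the emulator's actual code), this is the kind of bit-twiddling involved when the sign bit of a short gets in the way:

```java
public class Msb16 {
    public static void main(String[] args) {
        short reg = (short) 0x8001;                // a 16-bit "register" with the top bit set
        boolean msb = (reg & 0x8000) != 0;         // test the most significant bit directly
        int unsigned = reg & 0xFFFF;               // mask to get the unsigned 16-bit view
        System.out.println(msb + " " + unsigned);  // prints "true 32769"
    }
}
```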






                                  edited Sep 21 '10 at 2:07

























                                  answered Sep 21 '10 at 2:02









                                  Dinah

                                      up vote
                                      1
                                      down vote













                                      I think in most applications short has no domain meaning, so it makes more sense to use Integer.






                                          answered Oct 8 '09 at 19:03









                                          C. Ross

                                              up vote
                                              1
                                              down vote













                                               short and others are often used for storing image data. Note that it is the number of bits which is really important, not the arithmetic properties (which just cause promotion to int or better).
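As a sketch of the image-data case (array and sample values invented), a short[] stores 16-bit samples compactly, with a mask to read them back as unsigned:

```java
public class PixelSamples {
    public static void main(String[] args) {
        short[] pixels = new short[4];        // 2 bytes per element, versus 4 for int[]
        pixels[0] = (short) 40000;            // a 16-bit sample; stored as a negative short
        int sample = pixels[0] & 0xFFFF;      // masking recovers the unsigned value
        System.out.println(sample);           // prints 40000
    }
}
```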



                                              short is also used as array indexes in JavaCard (1.0 and 2.0, IIRC, but not 3.0 which also has an HTTP stack and web services).






                                                  answered Oct 8 '09 at 20:21









                                                  Tom Hawtin - tackline

                                                      up vote
                                                      1
                                                      down vote













                                                      byte happens all the time: buffers, specifically for networks, files, graphics, serialization, etc.
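A tiny sketch of the typical buffer pattern (the stream contents are invented):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class BufferedRead {
    public static void main(String[] args) throws IOException {
        InputStream in = new ByteArrayInputStream(new byte[] {1, 2, 3, 4, 5});
        byte[] buffer = new byte[4096];   // I/O buffers are almost always byte arrays
        int n = in.read(buffer);          // returns the number of bytes actually read
        System.out.println(n);            // prints 5
    }
}
```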






                                                          answered Aug 12 '10 at 19:46









                                                          Dean J

                                                              up vote
                                                              0
                                                              down vote













                                                              Most of the time, there's never a really good technical reason for a developer (Java, C#, BASIC, etc.) to choose between an int, short or byte - when the capacity is enough, of course. If the value will stay under 2 billion, then int it will be.



                                                              Are you sure we'll have people older than 255? Well, you never know!



                                                              Aren't 32,767 possible countries enough? Don't think too small!



                                                              In your example, you can be perfectly happy with your byte var containing 100, if you are absolutely sure that it will NEVER overflow. Why do guys use int the most? Because.... because.



                                                              This is one of those things that most of us just do because we saw it that way most of the time, and never asked differently.



                                                              Of course, I have nothing against "all things int". I just prefer to use the right type for each kind of value, no stress involved.






                                                              share|improve this answer
























                                                                up vote
                                                                0
                                                                down vote













answered Apr 30 '14 at 23:12

Charles Roberto Canato
17829


























