  1. .net - What is a byte [] array? - Stack Overflow

    Jun 28, 2015 · In .NET, a byte is basically a number from 0 to 255 (the numbers that can be represented by eight bits). So, a byte array is just an array of the numbers 0 - 255. At a lower …
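
The answer describes .NET, where `byte` is unsigned (0-255). Java's primitive `byte` is signed (-128 to 127), so recovering the 0-255 view of a stored byte takes a mask; a minimal sketch:

```java
public class ByteRangeDemo {
    // In Java, byte is signed: values run from -128 to 127.
    // Masking with 0xFF widens to int and clears the sign extension,
    // yielding the unsigned 0-255 value the byte actually stores.
    public static int toUnsigned(byte b) {
        return b & 0xFF;
    }

    public static void main(String[] args) {
        byte[] data = { 0, 127, (byte) 128, (byte) 255 };
        for (byte b : data) {
            System.out.println(b + " -> " + toUnsigned(b));
        }
    }
}
```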

  2. How do I initialize a byte array in Java? - Stack Overflow

    Jun 26, 2012 · I have to store some constant values (UUIDs) in byte array form in java, and I'm wondering what the best way to initialize those static arrays would be. This is how I'm currently …
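
For constant UUIDs, the standard-library options are a literal array initializer or deriving the 16 bytes from `java.util.UUID`; a sketch (the UUID string here is just an example value):

```java
import java.nio.ByteBuffer;
import java.util.UUID;

public class UuidBytes {
    // Option 1: a literal initializer; values above 127 need an explicit cast.
    static final byte[] LITERAL = { 0x12, 0x34, (byte) 0xAB, (byte) 0xCD };

    // Option 2: derive the 16 bytes from java.util.UUID (big-endian,
    // most significant half first).
    public static byte[] toBytes(UUID uuid) {
        ByteBuffer buf = ByteBuffer.allocate(16);
        buf.putLong(uuid.getMostSignificantBits());
        buf.putLong(uuid.getLeastSignificantBits());
        return buf.array();
    }

    public static void main(String[] args) {
        UUID id = UUID.fromString("123e4567-e89b-12d3-a456-426614174000");
        System.out.println(toBytes(id).length);
    }
}
```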

  3. c# - byte [] to hex string - Stack Overflow

    Mar 8, 2009 · How do I convert a byte[] to a string? Every time I attempt it, I get System.Byte[] instead of the value. Also, how do I get the value in Hex instead of a decimal?
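
The question is about C#; in Java, the language of most of these results, the equivalent conversion is a short loop over `String.format`; a sketch:

```java
public class HexDemo {
    // Convert a byte array to its lowercase hex representation.
    // Masking with 0xFF avoids sign-extension surprises on negative bytes.
    public static String toHex(byte[] bytes) {
        StringBuilder sb = new StringBuilder(bytes.length * 2);
        for (byte b : bytes) {
            sb.append(String.format("%02x", b & 0xFF));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(toHex(new byte[] { (byte) 0xDE, (byte) 0xAD }));
    }
}
```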

  4. How to convert byte[] to Byte[] and the other way around?

    Mar 23, 2020 · How to convert byte[] to Byte[] and also Byte[] to byte[], in the case of not using any 3rd party library? Is there a way to do it fast just using the standard library?
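
With only the standard library there is no bulk conversion, because autoboxing does not apply to arrays; the usual answer is an element-by-element copy, sketched here:

```java
public class BoxingDemo {
    // byte[] -> Byte[]: box each element individually.
    public static Byte[] box(byte[] in) {
        Byte[] out = new Byte[in.length];
        for (int i = 0; i < in.length; i++) out[i] = in[i];
        return out;
    }

    // Byte[] -> byte[]: unbox each element (a null element
    // would throw a NullPointerException here).
    public static byte[] unbox(Byte[] in) {
        byte[] out = new byte[in.length];
        for (int i = 0; i < in.length; i++) out[i] = in[i];
        return out;
    }

    public static void main(String[] args) {
        Byte[] boxed = box(new byte[] { 1, 2, 3 });
        System.out.println(unbox(boxed).length);
    }
}
```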

  5. java - Byte [] to InputStream or OutputStream - Stack Overflow

    Jan 19, 2010 · I have a blob column in my database table, for which I have to use byte[] in my Java program as a mapping and to use this data I have to convert it to InputStream or …
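
For in-memory data the standard wrappers are `ByteArrayInputStream` and `ByteArrayOutputStream`; a sketch that round-trips a blob through both:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.UncheckedIOException;

public class StreamDemo {
    // Wrap a byte[] as an InputStream, copy it into an OutputStream,
    // and extract the bytes back out.
    public static byte[] roundTrip(byte[] blob) {
        InputStream in = new ByteArrayInputStream(blob);     // byte[] -> InputStream
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        int n;
        try {
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        } catch (IOException e) {
            throw new UncheckedIOException(e);               // cannot happen for in-memory streams
        }
        return out.toByteArray();                            // OutputStream contents -> byte[]
    }

    public static void main(String[] args) {
        System.out.println(roundTrip(new byte[] { 10, 20, 30 }).length);
    }
}
```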

  6. byte - Java: Convert String to packed decimal - Stack Overflow

    Do you want the packed decimal as a byte array, like your input, or do you want the actual n bytes as an actual packed decimal, which would be a String in Java?
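
A sketch of the sign-nibble convention such answers usually mean: two BCD digits per byte, with the final nibble holding the sign (0xC positive, 0xD negative). The helper name and the ASCII-digit assumption are illustrative, not a full EBCDIC-aware implementation:

```java
public class PackedDecimal {
    // Pack a digit string into COMP-3-style packed decimal.
    // Assumes the input contains only ASCII digits.
    public static byte[] pack(String digits, boolean negative) {
        // Pad with a leading zero so digits + sign nibble fill whole bytes.
        String padded = (digits.length() % 2 == 0) ? "0" + digits : digits;
        byte[] out = new byte[(padded.length() + 1) / 2];
        for (int i = 0; i < padded.length(); i++) {
            int d = padded.charAt(i) - '0';
            out[i / 2] |= (i % 2 == 0) ? d << 4 : d;   // high nibble first
        }
        out[out.length - 1] |= negative ? 0x0D : 0x0C; // sign in the last nibble
        return out;
    }

    public static void main(String[] args) {
        // "12345" packs to 12 34 5C
        for (byte b : pack("12345", false)) {
            System.out.printf("%02X ", b & 0xFF);
        }
        System.out.println();
    }
}
```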

  7. c# - Why does byte + byte = int? - Stack Overflow

    long + long = long, float + float = float, double + double = double. So why not byte + byte = byte and short + short = short? A bit of background: I am performing a long list of calculations on "small …
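
Java behaves the same way as C# here: binary numeric promotion widens both `byte` operands to `int` before the addition, so assigning the result back to a `byte` needs an explicit cast. A sketch:

```java
public class PromotionDemo {
    // byte + byte yields an int, so narrowing back to byte is explicit.
    public static byte addBytes(byte a, byte b) {
        // byte c = a + b;      // compile error: possible lossy conversion
        return (byte) (a + b);  // explicit narrowing cast; overflow wraps around
    }

    public static void main(String[] args) {
        byte a = 100, b = 27;
        int sum = a + b;        // fine without a cast: the sum really is an int
        System.out.println(sum + " " + addBytes(a, b));
    }
}
```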

  8. UnicodeDecodeError, invalid continuation byte - Stack Overflow

    Dec 11, 2016 · Latin-1 is a single-byte encoding family so everything in it should be defined in UTF-8. But why does Latin-1 sometimes win?

  9. How to fix: "UnicodeDecodeError: 'ascii' codec can't decode byte"

    UnicodeDecodeError: 'ascii' codec can't decode byte generally happens when you try to convert a Python 2.x str that contains non-ASCII to a Unicode string without specifying the encoding of …

  10. How to solve UnicodeDecodeError: 'utf-8' codec can't decode byte …

    Apr 7, 2019 · UnicodeDecodeError: 'utf-8' codec can't decode byte 0xff in position 0: invalid start byte Please see my screenshot here: I don't know either how to save the original data without …
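
The last three questions are Python, but the underlying failure is language-independent: the bytes are not valid in the charset chosen for decoding. The byte 0xFF can never appear in valid UTF-8, while Latin-1 maps every byte value, which is why falling back to Latin-1 always "succeeds". A Java sketch of the same situation, using a strict decoder so the error surfaces as an exception rather than replacement characters:

```java
import java.nio.ByteBuffer;
import java.nio.charset.CharacterCodingException;
import java.nio.charset.CodingErrorAction;
import java.nio.charset.StandardCharsets;

public class DecodeDemo {
    // Decode bytes as UTF-8, throwing on malformed input instead of
    // silently substituting U+FFFD as new String(bytes, UTF_8) would.
    public static String decodeStrict(byte[] bytes) throws CharacterCodingException {
        return StandardCharsets.UTF_8.newDecoder()
                .onMalformedInput(CodingErrorAction.REPORT)
                .decode(ByteBuffer.wrap(bytes))
                .toString();
    }

    public static void main(String[] args) {
        byte[] latin1 = { (byte) 0xFF };            // 'ÿ' in Latin-1
        try {
            decodeStrict(latin1);                   // 0xFF is an invalid start byte in UTF-8
        } catch (CharacterCodingException e) {
            System.out.println("not valid UTF-8");
        }
        // Decoding with the charset the bytes were actually written in succeeds.
        System.out.println(new String(latin1, StandardCharsets.ISO_8859_1));
    }
}
```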