Program to Convert Unicode to ASCII
Given a Unicode character, the task is to convert it into its ASCII (American Standard Code for Information Interchange) value.
Unicode is a character encoding standard widely used in communication systems and computing. Unlike ASCII, which is a 7-bit encoding defining 128 unique characters (0-127), Unicode uses variable-length encodings (such as UTF-8) to represent characters from a wide range of scripts and languages. Unicode can represent over a million distinct code points, making it a comprehensive character encoding standard.
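For example, the first 128 Unicode code points coincide with ASCII, while characters outside that range map to higher code points. A minimal Java sketch (the class name is illustrative) makes the distinction concrete:

public class CodePointDemo {
    public static void main(String[] args) {
        // 'A' lies in the ASCII range (0-127), so its Unicode
        // code point equals its ASCII value
        System.out.println((int) 'A');      // 65
        // U+00E9 ('é') is outside ASCII; its code point is 233
        System.out.println((int) '\u00E9'); // 233
    }
}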
Examples:
Input: Unicode = 'A'
Output: ASCII = 65

Input: Unicode = 'Z'
Output: ASCII = 90
Approach: Follow the steps below to convert a Unicode character to its ASCII value:
- Declare a string variable unicodeInput and initialize it with the Unicode character “A”.
- Call the unicodeToAscii function with unicodeInput as an argument.
- Define the unicodeToAscii function that takes a string unicodeNum as a parameter.
- In the try block:
- Extract the first character from the unicodeNum string.
- Convert the Unicode character to ASCII by casting it to an integer.
- Return the ASCII value.
- In the catch block:
- Handle StringIndexOutOfBoundsException by printing an error message with the exception’s message.
- Return -1 to indicate an error.
- In the main function:
- Check if the result from unicodeToAscii is not equal to -1.
- If true, print the Unicode input and the corresponding ASCII output.
- The program outputs the Unicode character “A” and its corresponding ASCII value.
Below is the code to get the ASCII value of a given character.
C++
#include <iostream>
#include <stdexcept>
#include <string>

class UnicodeToAsciiCpp {
public:
    // Convert a Unicode character to its ASCII code
    static int UnicodeToAscii(const std::string& unicodeNum)
    {
        try {
            // at(0) throws std::out_of_range if the string is empty
            // (operator[] would not throw, so at() is used here)
            char unicodeChar = unicodeNum.at(0);

            // Convert the character to its numeric code
            int asciiNum = static_cast<int>(unicodeChar);
            return asciiNum;
        }
        catch (const std::out_of_range& e) {
            // Handle the exception if the string is empty
            std::cerr << "Error: " << e.what() << std::endl;
            return -1;
        }
    }
};

int main()
{
    // Input Unicode character
    std::string unicodeInput = "A";

    // Call the method to convert Unicode to ASCII
    int asciiOutputCpp = UnicodeToAsciiCpp::UnicodeToAscii(unicodeInput);

    // Check if conversion was successful
    if (asciiOutputCpp != -1) {
        // Output Unicode and ASCII values
        std::cout << "Unicode: " << unicodeInput << std::endl;
        std::cout << "ASCII: " << asciiOutputCpp << std::endl;
    }
    return 0;
}
Java
public class UnicodeToAsciiJava {
    // Convert a Unicode character to its ASCII code
    public static int unicodeToAscii(String unicodeNum)
    {
        try {
            // charAt(0) throws StringIndexOutOfBoundsException
            // if the string is empty
            char unicodeChar = unicodeNum.charAt(0);
            int asciiNum = (int) unicodeChar;
            return asciiNum;
        }
        catch (StringIndexOutOfBoundsException e) {
            System.out.println("Error: " + e.getMessage());
            return -1;
        }
    }

    public static void main(String[] args)
    {
        String unicodeInput = "A";
        int asciiOutputJava = unicodeToAscii(unicodeInput);
        if (asciiOutputJava != -1) {
            System.out.println("Unicode: " + unicodeInput);
            System.out.println("ASCII: " + asciiOutputJava);
        }
    }
}
Python3
def unicode_to_ascii_py(unicode_num):
    try:
        # ord() raises TypeError unless given a single character
        ascii_num = ord(unicode_num)
        return ascii_num
    except TypeError as e:
        print(f"Error: {e}")
        return None


# Example usage
unicode_input = 'A'
ascii_output_py = unicode_to_ascii_py(unicode_input)
if ascii_output_py is not None:
    print(f"Unicode: {unicode_input}")
    print(f"ASCII: {ascii_output_py}")
C#
using System;

public class UnicodeToAsciiCSharp {
    // Convert a Unicode character to its ASCII code
    public static int UnicodeToAscii(string unicodeNum)
    {
        try {
            // Indexing an empty string throws IndexOutOfRangeException
            char unicodeChar = unicodeNum[0];

            // Convert the character to its numeric code
            int asciiNum = (int)unicodeChar;
            return asciiNum;
        }
        catch (IndexOutOfRangeException e) {
            // Handle the exception if the string is empty
            Console.WriteLine("Error: " + e.Message);
            return -1;
        }
    }

    public static void Main()
    {
        // Input Unicode character
        string unicodeInput = "A";

        // Call the method to convert Unicode to ASCII
        int asciiOutputCSharp = UnicodeToAscii(unicodeInput);

        // Check if conversion was successful
        if (asciiOutputCSharp != -1) {
            // Output Unicode and ASCII values
            Console.WriteLine("Unicode: " + unicodeInput);
            Console.WriteLine("ASCII: " + asciiOutputCSharp);
        }
    }
}
Javascript
function unicodeToAsciiJs(unicodeNum) {
    // charCodeAt does not throw on an empty string;
    // it returns NaN, so check for that instead of using try/catch
    const asciiNum = unicodeNum.charCodeAt(0);
    if (Number.isNaN(asciiNum)) {
        console.error("Error: empty input string");
        return null;
    }
    return asciiNum;
}

// Example usage
const unicodeInput = 'A';
const asciiOutputJs = unicodeToAsciiJs(unicodeInput);
if (asciiOutputJs !== null) {
    console.log(`Unicode: ${unicodeInput}`);
    console.log(`ASCII: ${asciiOutputJs}`);
}
Output:
Unicode: A
ASCII: 65
Time Complexity: O(1)
Auxiliary Space: O(1)
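Note that the cast simply yields the Unicode code point, which only equals an ASCII value for code points 0-127. A stricter variant (a hypothetical extension, not part of the code above) could validate the range before returning:

// Illustrative sketch: reject characters whose code point
// falls outside the ASCII range instead of returning it as-is.
public class StrictUnicodeToAscii {
    public static int unicodeToAscii(String unicodeNum) {
        if (unicodeNum == null || unicodeNum.isEmpty()) {
            return -1; // no character to convert
        }
        char unicodeChar = unicodeNum.charAt(0);
        // ASCII covers only code points 0-127; anything higher
        // has no ASCII equivalent
        if (unicodeChar > 127) {
            return -1;
        }
        return unicodeChar; // implicit widening to int
    }

    public static void main(String[] args) {
        System.out.println(unicodeToAscii("A"));      // 65
        System.out.println(unicodeToAscii("\u00E9")); // -1 (not ASCII)
    }
}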