I need to write a machine-language program to input two one-digit numbers, add them, and output the one-digit sum. The object code must be in hexadecimal. This is strictly machine language, not assembly! This is done using the Pep/8 simulator.
This is what I have so far, but all I get is 0 no matter what two integers are in the input. I'm not sure if it's all wrong, or if I just have the byte locations incorrect. Help would be much appreciated!
0000 3100FE ;DECI: read a decimal value into the word at 00FE-00FF
0003 3100FF ;DECI: read a decimal value into the word at 00FF-0100
0006 F100FA ;STBYTEA: store the accumulator's low byte at 00FA (is this a character?)
0009 D100FE ;LDBYTEA: load the byte at 00FE into the accumulator
000C 7100FA ;ADDA: add the word at 00FA-00FB to the accumulator
000F A1001A ;ORA: OR the sum with the 0030 mask to convert it to a character
0012 F10019 ;STBYTEA: store the character
0015 510019 ;CHARO: output the character
0018 00     ;STOP
0019 00     ;character to output
001A 0030   ;mask to convert decimal to ASCII
001C 000F   ;mask to convert ASCII to decimal
31 00 FE 31 00 FF F1 00 FA D1 00 FE 71 00 FA A1 00 1A F1 00 19 51 00 19 00 00 00 30 00 0F zz ; gets 0
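For reference, the conversion the two masks (0030 and 000F) are meant to perform can be sketched in Python. The helper name `add_digit_chars` is my own, and this sketch assumes the digits arrive as ASCII characters (as with CHARI/CHARO); DECI already delivers numeric values, so no mask would be needed on that path.

```python
def add_digit_chars(a: str, b: str) -> str:
    """Add two ASCII digit characters, returning the ASCII digit of the sum.
    Assumes the sum is at most 9, as in the original problem."""
    x = ord(a) & 0x0F      # 000F mask: ASCII digit -> numeric value ('5' = 0x35 -> 5)
    y = ord(b) & 0x0F
    s = x + y              # add the two one-digit values
    return chr(s | 0x30)   # 0030 mask: numeric value -> ASCII digit

print(add_digit_chars('2', '3'))  # prints 5
```

One thing worth checking against the listing above: DECI stores a two-byte word, so inputs stored at 00FE and at 00FF overlap by one byte, and a byte load from 00FE picks up only the high byte of the first number (which is 00 for one-digit inputs). That would make the value ORed with 0030 always zero, which matches the constant '0' output.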