The voltage of a power supply is the difference in electrical potential between its output terminals, which determines how strongly it can drive current through connected devices. Power supplies can have different output voltages, such as 5 V, 12 V, or 24 V, depending on their design and application.
Output voltage is crucial for matching the power supply to the voltage requirements of electronic components or systems.
To determine whether your power source is 110 V or 220 V, you can check the specifications label on the power source itself, which usually indicates the input voltage range. Additionally, you can use a multimeter to measure the voltage at the power outlet. In North America and some other regions, 110–120 V is the standard household voltage, while 220–240 V is common elsewhere and for high-power equipment.
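A multimeter reading will rarely land exactly on the nominal value, so it helps to compare the measurement against a tolerance band. The sketch below illustrates this with a hypothetical `classify_mains` function; the band limits are illustrative choices, not official limits.

```python
def classify_mains(v_rms: float) -> str:
    """Classify a measured RMS mains voltage into a nominal standard.

    Real-world readings drift around the nominal value, so each
    standard is matched against a band (illustrative, not official).
    """
    if 100 <= v_rms <= 127:
        return "110/120 V system"
    if 200 <= v_rms <= 240:
        return "220/240 V system"
    return "outside common nominal ranges"

# A reading of 117 V at the outlet indicates a 110/120 V system.
```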
Source voltage refers to the input voltage supplied to a power supply.
This is the voltage that the power supply receives from the mains or other electrical source before being converted to the desired output voltage. For example, if a power supply is designed to convert 220 V AC input to 12 V DC output, the source voltage is 220 V AC.
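Because conversion is not lossless, the current drawn at the source voltage can be estimated from the output power and an efficiency figure. The sketch below works through the 220 V AC to 12 V DC example; the `input_current` helper and the 85% efficiency are assumptions for illustration.

```python
def input_current(v_in_rms: float, v_out: float, i_out: float,
                  efficiency: float = 0.85) -> float:
    """Estimate the average AC input current drawn by a converter.

    Output power is v_out * i_out; dividing by efficiency gives the
    input power, and dividing that by the input voltage gives the
    input current. The 85% efficiency is an illustrative assumption.
    """
    p_out = v_out * i_out
    p_in = p_out / efficiency
    return p_in / v_in_rms

# A 220 V AC to 12 V DC supply delivering 2 A draws roughly 0.13 A
# from the mains.
i_in = input_current(220, 12, 2)
```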
A PC power supply typically provides multiple output voltages to accommodate different components. Common rails include 3.3 V, 5 V, and 12 V DC.
These voltages are used to power various parts of the computer, such as the motherboard, storage devices, and peripherals.
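The mapping between rails and components can be sketched as a small lookup. The `PC_RAILS` table and the ±5% tolerance in `rail_for` below are illustrative assumptions, not an exhaustive or official specification.

```python
# Typical PC power supply rail assignments (illustrative, not exhaustive).
PC_RAILS = {
    3.3: ["motherboard logic", "RAM"],
    5.0: ["USB ports", "drive logic"],
    12.0: ["CPU", "GPU", "drive motors", "fans"],
}

def rail_for(voltage: float, tolerance: float = 0.05):
    """Return the nominal rail matching a voltage within a +/-5%
    tolerance (an assumed figure), or None if no rail matches."""
    for nominal in PC_RAILS:
        if abs(voltage - nominal) <= nominal * tolerance:
            return nominal
    return None

# A component measuring 12.3 V is on the 12 V rail; 9 V matches none.
```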
The type of voltage that powers a power supply is usually alternating current (AC) from the mains power grid. The power supply converts this input into one or more DC (direct current) output voltages, depending on the requirements of the connected devices. Input voltage varies by region, with 110 V and 220 V AC being common values.
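One practical consequence of the AC-to-DC conversion is that the quoted AC voltage is an RMS value: a sinusoidal waveform actually swings well above it at its peak, which the rectifier stage must withstand. A minimal sketch of the standard RMS-to-peak relationship for a sine wave:

```python
import math

def peak_voltage(v_rms: float) -> float:
    """Peak value of a sinusoidal AC waveform from its RMS value:
    V_peak = V_rms * sqrt(2)."""
    return v_rms * math.sqrt(2)

# A 220 V RMS supply swings to about +/-311 V at its peaks, so the
# rectifier and filter components must be rated well above 220 V.
```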