Radio Wave Intensity and relationship to Power *URGENT*

Lukemorem2356
Posts: 4
Joined: Fri Feb 21, 2020 8:49 pm
Occupation: Student

Radio Wave Intensity and relationship to Power *URGENT*

Post by Lukemorem2356 »

Hello,

I am an 8th grader taking 9th grade physical science in South Florida. I am doing a very complex science "experiment", more or less a personal project done on my own time, in which I will be reverse engineering a piece of equipment that uses and generates radio waves.

I would like to know: if you were to take, for example, an antenna that resonates at 146 MHz, could the voltage be changed, resulting in either an increase or decrease in the intensity of the radio signal?

The experiment I am doing requires that I use radio equipment, but also that I increase or decrease the amount of electrical power. If I am using a radio transmitter that sends the signal through a cable to the antenna, meaning the electricity is DC current, then could I basically add more intensity to the signal by increasing the electrical input into the same cable, thereby intensifying the signal in the cable?

I also had a somewhat theoretical question regarding the relationship between radio fields and electricity. In a hypothetical scenario, if I had a transmitter sending a signal to an antenna that resonated at 146 MHz, and I decided to add more voltage to the cable from a separate power source other than the transmitter itself, would that increase the intensity for that antenna?

Thank you for all your help and input on this question.

Moderator note: I've combined one of your more recent posts with this post and removed the second one. Science Buddies' guidelines request that you only post each question once. Please be patient waiting for a response -- the experts here are volunteers and, as such, do not monitor these forums 24/7. Thank you!
rmarz
Expert
Posts: 634
Joined: Sat Oct 25, 2008 1:26 pm
Occupation: Technology Consultant
Project Question: n/a
Project Due Date: n/a
Project Status: Not applicable

Re: Radio Wave Intensity and relationship to Power *URGENT*

Post by rmarz »

Lukemorem2356 - I don't know exactly what you mean by changing voltage to increase or decrease RF power to the antenna. I'll assume you are asking whether changing the input voltage to the transmitter will produce a noticeable change in RF power. For example, say the normal power to the transmitter comes from a 12 volt DC source. Would lowering the voltage to 11 volts, or increasing it to 13 volts, result in a measurable change? The answer is 'not necessarily'. There is likely voltage regulation circuitry within the transmitter to allow for good performance over a small change in supply voltage. Certainly, if you decrease the voltage substantially, at some point the RF output power will be reduced and ultimately will cease. A substantial increase in voltage may increase RF output power, but may also damage the transmitter at some point. The 146 MHz band is in the middle of the 2 Meter Amateur radio band, so I assume this is the equipment you are using. Caution - experimenting in this band may violate FCC rules.
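
As a rough illustration of why supply voltage and RF output are only loosely coupled, here is a minimal sketch (Python, with made-up example numbers) of a simplified, unregulated model where the drive voltage tracks the supply and the output stage works into a fixed 50-ohm load, so power goes roughly as voltage squared. A real transmitter with internal regulation would hold its output nearly constant over small supply changes, so treat this only as a what-if model.

```python
# Simplified, hypothetical model: RF output power vs. DC supply voltage.
# Assumes the power amplifier's drive voltage tracks the supply directly
# (no regulation) and feeds a fixed 50-ohm antenna load, so P = V^2 / R.
# Real transmitters regulate internally, so small supply changes often
# produce little or no change in RF output.

LOAD_OHMS = 50.0        # nominal antenna/feedline impedance (assumption)

def rf_power_watts(supply_volts, drive_fraction=0.5):
    """Very rough estimate: RF drive voltage is some fraction of supply."""
    v_rf = supply_volts * drive_fraction   # assumed RMS drive voltage
    return v_rf ** 2 / LOAD_OHMS

for v in (11.0, 12.0, 13.0):
    print(f"{v:4.1f} V supply -> ~{rf_power_watts(v):.2f} W (unregulated model)")
```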

You asked if adding DC power to the cable feeding the antenna would increase RF power to the antenna. The answer to that is simply no. In fact, the RF power radiated by the antenna would probably be reduced, as the 146 MHz RF energy would likely be shunted by the low internal resistance of the DC power supply.
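
To see why a low-impedance DC supply across the feedline hurts rather than helps, here is a hedged sketch in the same Python style. The 0.1-ohm source resistance is an assumed example value; the point is that a small resistance in parallel with a 50-ohm antenna spoils the match and most of the RF power never reaches the antenna.

```python
# Hypothetical illustration: a low-impedance DC source connected across a
# 50-ohm feedline looks (to the 146 MHz signal) roughly like a small
# resistance in parallel with the antenna, spoiling the match.

Z0 = 50.0        # feedline / antenna impedance, ohms
R_SUPPLY = 0.1   # assumed internal resistance of the DC supply, ohms

# Parallel combination seen by the transmitter at RF
z_load = (Z0 * R_SUPPLY) / (Z0 + R_SUPPLY)

# Reflection coefficient and fraction of power reflected back
gamma = (z_load - Z0) / (z_load + Z0)
reflected_fraction = gamma ** 2

print(f"Effective load: {z_load:.3f} ohms (vs. 50 ohms expected)")
print(f"Fraction of RF power reflected: {reflected_fraction:.1%}")
```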

A typical commercial 2 Meter transmitter often allows the operator to switch output power between 1 watt and 5 watts, or something in between. Could you tell us more about your experiment? If you are building your own transmitter, perhaps there is another way to reduce RF power output to the antenna.
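
For a sense of how much "intensity" changes with transmitter power, here is a short sketch using the standard free-space far-field relation E = sqrt(30*P*G)/d. The isotropic gain and 100 m distance are assumed values for illustration only; the takeaway is that going from 1 watt to 5 watts raises the field strength by only about sqrt(5), roughly 2.2 times.

```python
import math

# Illustrative free-space estimate of field strength vs. transmit power.
# E = sqrt(30 * P * G) / d  (far field, no ground effects, assumed values)

GAIN = 1.0         # isotropic antenna gain (assumption for illustration)
DISTANCE_M = 100   # observation distance in meters (example value)

def field_strength_v_per_m(power_watts, gain=GAIN, distance_m=DISTANCE_M):
    return math.sqrt(30.0 * power_watts * gain) / distance_m

for p in (1.0, 5.0):
    print(f"{p:.0f} W -> ~{field_strength_v_per_m(p) * 1000:.1f} mV/m at {DISTANCE_M} m")

# Note: intensity (power density) scales linearly with transmit power,
# but field strength scales with the square root of power.
```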

Rick Marz (KD6EFB)
