Author Topic: TED 5000 data logging and analysis with GNU/Linux  (Read 32653 times)

iteration69

  • Newbie
  • *
  • Posts: 26
  • Karma: +0/-0
Re: TED 5000 data logging and analysis with GNU/Linux
« Reply #15 on: August 15, 2011, 06:58:52 AM »
GAR:

With any luck I'll have those bad graphs corrected. I'm currently sidetracked but do intend to fix them.

Understand that each home is different. The energy use is effectively a profile of the people who occupy the home as well as the appliances and their effects. The profile also indicates what type of people they are. If I were to show a plot of a friend's home you would probably call BS. But you must understand the underlying factors which make up the profile. We both have electric stoves, microwaves, conventional refrigerators and freezers, and, until a few weeks ago, conventional hot water heaters.

First let's consider my own home and its occupants. I work Monday through Friday, primarily third shift. My wife only works weekends, second shift. I'm home on the weekend with my daughter, using much more energy than I do through the week. My wife and daughter are home all day through the week using whatever you might expect them to use. I get home when they are sleeping and use a given amount of energy. Then I go to sleep about the time they are waking up, and I get up around noon. So there is a nearly continuous envelope of energy use. -- This is certainly not normal, and you should not expect to understand the usage (whether a simple hot water profile or a complicated net analysis) without first understanding the underlying factors.

Now, if I were to share a plot of a friend's energy use, we would see literally 1/20th of my peak month (which I've not shared yet). This is because my friend and his wife have no children and are both vegetarian, so they do not tend to use as much power. Nor do they have central air in their home, nor does he have an additional building with central air. But as I stated, I have not shared those graphs, with good reason.

I could easily share second-by-second data for the Voltex, but it's very noisy and subjective -- not quite as useful for load balancing. Second data is very useful for trend analysis though. I try to use the most representative graph possible, and that's obviously subjective too. I suppose this boils down to something as simple as everyone needing to be on the same page, which I'm learning may be a difficult thing to do.

I assume you have read how my log works? (It is a prerequisite to everything I do.) I am logging net, heat pump, hot water, and my building -- the building itself has central air and numerous pieces of electronic equipment. For my friend: net, hot water, refrigerator, freezer. And then he has a second TED with DUT1-4 (device under test).

Our intentions are to understand every aspect of our energy use, from noise all the way up to replacement designs for going off grid. Some measurements are meant to determine the contributing energy consumers, while others, such as the heat pump hot water heater, are a mix of cross analysis, verification of the manufacturer's claims, off-grid planning, equipment trending, and general research. You obviously understand that no single graph can paint such a complicated picture, and by the same token, for me to share every single graph I'd have to write a book. So please bear with me if you don't understand why I decided to use a specific graph.

No matter how many times I double and triple check, I'm bound to have some typos or mislabeling, and I will greatly appreciate your critique.

Can you share any plots?
 
To be continued...

iteration69

  • Newbie
  • *
  • Posts: 26
  • Karma: +0/-0
Re: TED 5000 data logging and analysis with GNU/Linux
« Reply #16 on: August 15, 2011, 10:42:54 AM »
I started to regenerate those graphs and was thinking about my previous statement regarding a single graph when I realized that even though I can't correlate second, minute, hour, and day on a single graph, I can add several curve fits to the entire set of a given device.

So here goes.


The moving average is an arithmetic mean over a 7-day window; the log fit is a standard fitting algorithm of:
y = a + b*ln(sign*(x - c))

The series itself was pulled from the daily data.
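For anyone who wants to reproduce this against their own logs, something like the following Python/SciPy sketch gives the same 7-day mean and log fit (the file name, data layout, and starting guesses here are just placeholders for illustration):

Code:
# rough sketch: 7-day arithmetic-mean moving average plus log fit y = a + b*ln(sign*(x - c))
import numpy as np
from scipy.optimize import curve_fit

daily = np.loadtxt("hot_water_daily.log")      # hypothetical file: one kWh value per day
days = np.arange(len(daily), dtype=float)

# 7-day arithmetic-mean moving average
window = np.ones(7) / 7.0
moving_avg = np.convolve(daily, window, mode="valid")

sign = 1.0                                      # flip to -1.0 for a decaying series
def log_model(x, a, b, c):
    return a + b * np.log(sign * (x - c))

# c must keep the log argument positive, so start it just left of the data
p0 = (daily.mean(), 1.0, -1.0)
params, _ = curve_fit(log_model, days, daily, p0=p0, maxfev=10000)
print("a=%.3f  b=%.3f  c=%.3f" % tuple(params))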

More to come later...


GAR

  • Full Member
  • ***
  • Posts: 131
  • Karma: +0/-0
Re: TED 5000 data logging and analysis with GNU/Linux
« Reply #17 on: August 15, 2011, 06:01:33 PM »
110815-0954 EDT

iteration69:

I will provide some plots after I solve a problem getting into my web site.

Do you plan or expect to need to operate without grid power?

.

GAR

  • Full Member
  • ***
  • Posts: 131
  • Karma: +0/-0
Re: TED 5000 data logging and analysis with GNU/Linux
« Reply #18 on: August 16, 2011, 09:17:46 PM »
110816-1309 EDT

iteration69:

You can access my plots at http://beta-a2.com/EE-photos.html .

Go to the end and look at plots P26, P27, and P28.

.

GAR

  • Full Member
  • ***
  • Posts: 131
  • Karma: +0/-0
Re: TED 5000 data logging and analysis with GNU/Linux
« Reply #19 on: August 19, 2011, 10:19:57 PM »
110819-1334 EDT

iteration69:

If I were trying to compare your two water heaters I would first do the following before any statistical analysis.

Measure two aspects of water heater operation.
1. Measure the input electrical energy to raise 1 gallon of water from T1 to T2. Where T1 is the input temperature of the water to the tank, and T2 is the desired tank temperature.
2. Measure the rate of power loss when the water in the tank is at T2.

If my calculations are correct, it takes about 3956 watt-seconds to raise 1 quart of water 1 deg C. Thus, 4.396/1000 kWh per gallon per deg C.

For a 50 gallon tank and a rise of 75 deg F this is 50 * 4.396 * 75 * 5 / (9 * 1000) kWh. The result is 9.158 kWh, and at my rate of $0.13/kWh the cost is $1.19. This would be my cost per shower using electricity. But I use gas, so it is about $0.40.
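The same arithmetic in a few lines of Python, for anyone who wants to plug in their own numbers (everything here is simply the figures quoted above, nothing measured):

Code:
# heating-energy arithmetic from the figures above
J_PER_QUART_PER_C = 3956.0        # ~joules to raise 1 quart of water 1 deg C
QUARTS_PER_GALLON = 4

kwh_per_gallon_per_C = J_PER_QUART_PER_C * QUARTS_PER_GALLON / 3.6e6   # joules -> kWh
gallons, rise_F, rate_per_kwh = 50, 75, 0.13
rise_C = rise_F * 5.0 / 9.0

kwh = gallons * kwh_per_gallon_per_C * rise_C
print("%.3f kWh, $%.2f" % (kwh, kwh * rate_per_kwh))   # ~9.16 kWh, ~$1.19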

When I ran a test on a 1 quart hot pot the measured input energy to heat the water was very close to the theoretical value. Then I also ran the test in a microwave oven and the required energy was about 2.25 times that of the hot pot.

Once the hot water tank is up to T2, and without removing any water from the tank, measure the energy input over possibly 10 on-off heat cycles. Measure the total time of the test, starting with the first on time and ending with the start of the (N+1)th cycle. Calculate the average power. This is the rate of heat loss.

If both tanks are not the same size, then scale the power loss of the smaller tank by "larger tank volume" / "smaller tank volume" to make a relative comparison.

In actual daily use the performance of the resistance heated tank can be predicted from the above information and gallons of water used.
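A rough sketch of that standby-loss measurement and the daily prediction, again in Python (the cycle energies, test duration, and gallons per day below are invented placeholder numbers, not measurements):

Code:
# standby loss: total heater energy over N idle on-off cycles / total test time
cycle_energies_kwh = [0.21, 0.20, 0.22, 0.21, 0.20, 0.22, 0.21, 0.20, 0.21, 0.22]
test_hours = 30.0                          # first turn-on to start of cycle N+1

standby_loss_kw = sum(cycle_energies_kwh) / test_hours   # average power = rate of heat loss

# predicted daily use for the resistance tank: heating energy for the water drawn
# plus 24 hours of standby loss (per-gallon figure from above, 75 deg F rise)
kwh_per_gallon = 4.396e-3 * (75 * 5.0 / 9.0)
gallons_per_day = 40
predicted_daily_kwh = gallons_per_day * kwh_per_gallon + standby_loss_kw * 24
print("standby %.3f kW, predicted %.2f kWh/day" % (standby_loss_kw, predicted_daily_kwh))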

For the heat pump water heater, its characteristics may change depending upon the change of temperature within the tank under normal usage. Thus, statistical analysis of its power input may be necessary, and you will need to know the amount of water used over the test period.

Another factor in the cost of the heat pump heater is the cost of the energy supplied to the air from which the heat pump draws its heat. If it is outside air, then this aspect is not a cost.

.

iteration69

  • Newbie
  • *
  • Posts: 26
  • Karma: +0/-0
Re: TED 5000 data logging and analysis with GNU/Linux
« Reply #20 on: September 04, 2011, 08:28:12 AM »
GAR.

There is a problem: you and I do not have enough information about one another to establish anything concrete here. This will be a very lengthy post, as I intend to fill some of the gaps between "just another guy posting on a forum" and who I am, or more specifically what I am capable of. With any luck you can do the same, and we may be able to establish grounds on which we can share detailed information regarding our specific experiments.

First let me state my profession (it just so happens that my profession and personal interests require the same disciplines): I repair laboratory instrumentation. That is, mass spectrometers, gas chromatographs, total organic carbon analyzers, sulfur analyzers, ion chromatographs, calorimeters, inductively coupled plasma torches -- just to name a few. Mind you, this position is not bells and whistles. I don't have the luxury of a repair manual with signal sources and test points. I reverse engineer everything I work on before I can find the problem and render a repair. Prior to the repair position I was a technician in energy analysis (about 4 years), that is, geochemical energy (oil, coal, natural gas). And prior to that I was an embedded systems engineer (15 years of experience): electronics design, firmware development, PCB design, routing, testing (bed-of-nails algorithms), and assembly. Our efforts focused on highly advanced refrigeration controls that were capable of predicting equipment failure and entering automatic limp modes to extend the operating time of equipment that would otherwise have failed. Later I applied my research to HVAC/R. I ended up getting out of embedded controls and switched gears to laboratory work. I maintain control of all the previous technology we had worked on, the building, equipment, etc.

The work that I do now -- repairs, analysis, and design -- is all traceable and audited by major standards organizations such as NIST, ASTM, and NELAC. Mind you, this is the work I perform for a company, not for myself. Obviously the tools and methods I have access to at work are very different from what I have access to at home. In some cases I have more equipment at home (as far as engineering and test equipment goes); in other cases, such as analysis, I am much better equipped at work. As the point of this thread is analysis using Linux, and since I use Linux at home, I'm simply sharing what I am currently capable of while using Linux at home. The theoretical limit of Linux is no different from any other major operating system, but there is a limit to how much time I can invest in a particular analytical solution with the tools at hand. At the same time I hope that someone else may be able to benefit from some of my work. With this in mind I tend to dumb down, or filter, a great deal of my experience in most forum posts so that more people may benefit. After all, we can't expect everyone to have access to elaborate equipment, the prerequisites of certain theories, or even a given level of experience.

In short, nearly everything you have brought to light has already been considered, and neglected, for this very reason.

This is a good thing. In fact it is better than good, it is great! To me this means that you are capable of something that I have found most people to lack. That is, thought.  :)


Now, with all that out of the way: your rough ideas of using water to determine energy are not going to yield the results you may expect (or, perhaps better worded, that /I/ would expect). This is a bold statement, and a good reason and justification for my mentioning experience -- this way you have an idea that I may actually know what I'm talking about, as opposed to simply googling a given subject and becoming a self-proclaimed guru in ten seconds. For all those google-gurus: I own books you will never know existed. But I digress.

The subject of calorimetry requires tight control of thermal mass, thermal tanking, and heat exchange (loss and gain). All modes must be considered (radiation, convection, conduction), as well as phase changes of the material and meso-scale macro changes to molecules. Nor can we neglect common calorimetry modes such as isoperibol and adiabatic. Another factor is the water itself: as water quality changes, so does the specific heat capacity. Very specific conditions must be maintained. I would have to write a book to explain the details, and that is far beyond the scope of my free time and yours.

For a primer I refer you to what I would consider an introduction to the theory of calorimetry and associated methods.
Theory of Calorimetry ISBN 1-4020-0797-3

Note that this book will not cover the details of phase changes, meso-scale macro changes, or non-Newtonian materials and systems. Neglecting these as subtleties is likely the difference between an analytical-grade company and just another guy making a quick buck.

Yet another problem is the actual measurement. Modern methods rely on an electronic measurement that relates to the heat in some manner. Common methods sense the temperature directly, most often using a thermistor, though the sensing of heat is certainly not limited to thermistors. Thanks to the work of Steinhart and Hart in oceanography, the thermistor has become a much more useful device. But the Steinhart-Hart equation is only the beginning. Each thermistor is different, even those of the same batch, material type, and lot number. Before a thermistor can be calibrated it must be modeled. After the modeling (i.e., extraction of the material- and mass-specific coefficients that properly represent the device), the mass-heat index must be cited.

All thermistors will self-heat to some degree, and this is a direct result of the thermistor's insertion into the electronic circuit. Manufacturers tend to cite the "self heating" figure as "for information purposes only"; that is, it must be determined experimentally if you expect a reasonable level of performance. I suppose a loose term for this effect could be "insertion heat of a circuit", because it is the assumptions of the circuit which cause the device to heat in the first place.

The electronics alone are another source of problems. The amplifiers and data converters must be housed in a thermal oven with precise temperature control in order to reduce drift due to thermal gradients. Once again, this is another subject I could write a book about, but I have neglected to mention all of this with good reason.
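Just to give an idea of what the very first step looks like, here is a bare-bones Steinhart-Hart fit from three calibration points in Python (the resistance/temperature pairs are invented for illustration; a real calibration takes far more care than this):

Code:
# Steinhart-Hart: 1/T = A + B*ln(R) + C*ln(R)^3, coefficients from three calibration points
import numpy as np

# three (resistance in ohms, temperature in deg C) calibration points -- invented values
cal = [(32650.0, 0.0), (10000.0, 25.0), (3602.0, 50.0)]

lnR = np.log([r for r, t in cal])
invT = np.array([1.0 / (t + 273.15) for r, t in cal])

# solve the 3x3 linear system for A, B, C
M = np.column_stack([np.ones(3), lnR, lnR ** 3])
A, B, C = np.linalg.solve(M, invT)

def temperature_C(resistance_ohms):
    """Convert a measured resistance to temperature via Steinhart-Hart."""
    lr = np.log(resistance_ohms)
    return 1.0 / (A + B * lr + C * lr ** 3) - 273.15

print("%.2f C" % temperature_C(10000.0))   # lands on ~25 C, the middle calibration point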

Errors in sensing elements are cumulative, and cumulative errors are transformed into multiplicative errors by signal-conditioning circuitry. So yes, we really do need to consider the effects of everything when dealing with calorimetry.

My work brings me very close to theoretical physics, and with that I tend to operate from a paradigm that most people will never know exists. In other words, I'm very deep in the rabbit hole.

In the abstract, there are two paths we can take in our home analysis using Linux:
  • Our results are traceable, thus we sell the data we produce
  • Our results are not traceable, thus bogus by any scientific measure

I made the statement early on that I do not intend to sell anything; I don't have a "product". This puts me in the second category: my results are not traceable, and thus bogus by any scientific measure. I understand and accept this as fact. The point is that the results work for me; they do what I claim they do, even though I have no intention of proving anything beyond what I would consider a reasonable doubt. I have no hidden incentives here.

I don't know what it is that you are trying to accomplish, nor do I know the level of precision you expect to achieve.

For me, the goals are simply to quantify energy use, and comparisons, to a reasonable degree of accuracy so that I can make informed decisions to drop my electric bill and hopefully one day go off grid. For the sake of argument I'll state that I expect results that coincide with my electric bill to within ten percent. I think that's reasonable, don't you?

Since I'm not selling data I don't have to prove anything, and that is part of the problem with most products: they can claim anything they want with virtual immunity to legal action. On the flip side, anyone using a product that is not traceable cannot use the data in a legally defensible manner. Legally defensible data is obviously something that people expect, otherwise I'd not have the job that I do. But I do expect my results to have a degree of accuracy even though I'm not selling them, as I will no doubt look back on these posts in order to remember something specific. I'm not about to shortchange myself here in order to make something look better than it actually is.

Realistically, most people do not expect traceable analysis, and with that there is a built-in notion, or assumption, that the results are as-is. This brings me to the TED: obviously the device does not produce legally defensible data. If it did, it would cost ten times what it does and require yearly calibration from a traceable source and professional installation.

But we understand (or I hope we understand!) that the TED cannot produce legally defensible data. The TED relies on an element of common sense: if we make a change, we should be able to see that change numerically. It's not too far a stretch to say that the TED was designed to operate over a rather large range of conditions. Considering the cost, the broad range of operation, and my experience designing embedded systems and working in laboratories, it's also not a far stretch to say that the TED is not terribly accurate, precise, or repeatable.

I could have made this statement without ever using the TED, with a fair assumption that the statement itself is true, but since I do own a TED I happen to know it is true. Now, knowing that the TED is not terribly accurate, precise, or repeatable, there is no indication that it could ever be used for something as complicated as calorimetry.

Fortunately, the TED is good enough to be used for general power analysis. When I say general power analysis, I suppose I mean that it will give better results than manually reading my meter and jotting the reading down on paper.

Back to the subject of comparing the hot water heaters. I realize that this is a complicated comparison because the heat pump extracts heat energy from the local environment. One good argument against it is wintertime, when I must pay to heat my home. In that case I will most likely switch back to resistive heat (honestly, I'm interested to see how the modes compare). In the summer the benefit is twofold, as I get free air conditioning while heating my hot water. How much free air conditioning? I have no idea. Unfortunately I do not have enough control over my energy use to ever cite a reduction in air conditioning due to the heat pump hot water heater. By this time next year I will likely have made drastic changes to my energy footprint and will never be able to cite the exact changes due to the heat pump hot water heater alone.

I've already installed a low-flow shower head, and next is an energy-reclaiming heat exchanger that will preheat the cold water supply to the hot water heater. Before the summer of 2012 I hope to have 2 kW of hybrid solar panels installed; these will also help preheat the cold water to the hot water heater, as well as power the lighting and computer circuits in my home.

In addition, I'm currently evaluating an induction cooktop as a possible replacement for my costly electric range. If the induction cooktop proves to be cheaper to operate (25% or better), this will be yet another change implemented to help drop my power use.


Here is the latest graph regarding hot water power use.




GAR

  • Full Member
  • ***
  • Posts: 131
  • Karma: +0/-0
Re: TED 5000 data logging and analysis with GNU/Linux
« Reply #21 on: September 04, 2011, 06:51:10 PM »
110904-0943 EDT

iteration69:

Thank you for the broad description of your background.

From my side:

My background is electrical engineering. My experience has ranged over radio, electronics, sound, electrostatic printing, solving several major troubleshooting problems on BB-64 in Korea, and vacuum tube and transistor measurements. I made a point-contact transistor in 1952 in my YMCA room in Brooklyn while on active duty (I lived off base and wore civilian clothes; I was in a group at the shipyard working on instrumentation for transistors, but I was working on a vacuum tube tester). Other work includes development of psychophysical measuring equipment used in experiments on signal detectability, automatic random number generation, automotive ignition systems, secure communication, industrial computers, welding equipment, liquid level controls, circuit breakers, smoke detectors, industrial gaging equipment, and troubleshooting automotive assembly lines.

My experiments indicate that the basic TED MTU is a moderately accurate power meter -- very much better than 25%. It uses a chip that is also used in electric company kWh meters. Zero stability seems to be good. Balance between the two current sensors seems good. I cannot speak for the linearity of the current sensors.

Using a pair of Fluke 87s, I doubt that I could get a power accuracy better than maybe 4% just based on their specifications. With calibration it could be better, but what do I have for calibration standards to do that? Not much. With calibration the TED should be better than 1%.
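The rough reasoning: for a product like P = V * I, the relative errors of the two readings add to first order. A trivial sketch, with placeholder accuracies rather than the actual Fluke 87 specifications:

Code:
# worst-case relative error of a product is roughly the sum of the factors' errors
v_err, i_err = 0.02, 0.02        # assumed relative accuracy of the voltage and current readings
p_err = v_err + i_err            # P = V * I, so relative errors add (to first order)
print("power accuracy ~%.0f%%" % (p_err * 100))   # ~4%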

More on your water heater discussion later.

.

   

iteration69

  • Newbie
  • *
  • Posts: 26
  • Karma: +0/-0
Re: TED 5000 data logging and analysis with GNU/Linux
« Reply #22 on: September 06, 2011, 10:35:13 AM »
GAR

Some history:

Around January of 2011 my electric use started to increase. At the time I did not have a TED 5000. I was using my Fluke 123 and a high-current probe (I think it's 800 A max) trying to narrow down which circuit(s) the load was on. I'd put the meter on trend plot and go to work, then come home and check the plot. No signs of the peak. I was also reading my meter manually at this point and jotting it down -- very tedious, and very inaccurate. I think most of my manual recording errors were due to the fact that it was blistering cold outside while I tried to read the meter. But nonetheless I was able to graph the data and see that something was most certainly wrong. My usage increase was almost logarithmic!

This was certainly not normal. I kept fighting and using the scope meter in trend mode, and after two months I was able to determine part of the problem, but I only knew the cycle, not the event. Using a little logic I narrowed down the circuits that were actually capable of carrying the large currents that I noted on the graph. Still manually reading the meter, and trying to get lucky and catch the event with my scope meter, I determined that my problem was twofold: a great deal of power was being drawn by my water heater and by my electric range. I had enough data to show that the electric range was not at fault; it was my wife. As you know, each device has an energy signature, and I could see this signature even with my rough approach. I showed graphically that the electric range had excessive use from Monday through Friday (when my wife was home). On the weekends, when she would work, the excessive use did not exist (I continued to use the electric range normally so that I would not bias the analysis). With this I proved that the electric range itself was not at fault and it had to be something my wife was doing. I talked to her about it and she could not think of anything different, even though the graphical analysis said otherwise.

As far as the existing water heater? I knew it was getting bad. It would bang and pop when it turned on, and I tested it to make sure it was not leaking any current back through the ground wire, or through earth ground via the cement floor in the basement (i.e., an electric shock hazard), but it was OK in that regard. Not knowing exactly how a good hot water heater should perform, I actually started to whip up a program to fetch data from the scope meter and log it, when I asked myself, "What the hell am I doing? There is a device that will do this." Mind you, I work a lot, and whipping up a custom program would have been 10x as costly to me, due to time constraints, as buying the TED 5000. Not knowing much about TED, I only bought the gateway and a single MTU. The first thing I did, even before setting up the MTU, was to give myself remote access to the gateway -- it seems that stuff always breaks when I'm not home, and what a great way to keep an eye on things.

Giving my wife access to real-time power use solved the problem of excessive energy use from the electric range, but the hot water heater gremlin still existed. I seriously regret not being able to log the data in the first month or two. Around this time a fan motor failed in the outdoor unit of my split-system heat pump (air to air). I was at work and my wife called and said the heat was not working. I was able to see something was different when I pulled up the Footprints web interface at work. Within a few minutes, while working with my wife over the phone, I was able to determine that the outdoor unit fan motor had failed. Fortunately, I have four sources of heat in my home, so losing the heat pump for a few days until I could find time to replace the motor was not a major issue. Knowing that I was going to replace the hot water heater, and not being able to log the data, was a major problem. So I came up with the logging solution I am using now.

I bought another MTU and a hot water heater the same day. I installed the hot water heater, swapped the MTU from mains to the hot water heater (because I had to wait on the additional MTU to be shipped), and began logging. I did not have net use during the log period because I had swapped the MTU over to the hot water heater. Looking back I should have approached the problem differently in order to yield more data, but excessive energy bills were a bigger concern at the time. By the time I replaced the hot water heater, my electric use was nearly 5x normal.

As it turns out, the TED 5000 is much better than my scope meter would have been. The biggest problem with the scope meter would have been the current clamp. On AC its baseline is 1 amp; on a single leg that's a minimum of ~110 watts. That's a considerable margin when trying to find spurious loads.

I recently bought a TED 5000 + 4 MTUs. I also have a plug computer that will provide the same level of logging that I currently have in my home. My intention is to do detailed energy analysis for friends.

Have you studied the signals from the current sensors to the MTUs? The reason I ask is that I will probably extend the wires by 5-10 feet just to make it easier to install in a temporary fashion. Worst case, I'll add a buffer if the signal degrades. Considering the existing wires, I don't think this will be a problem though.

If you have not studied these signals I certainly will when the new TED 5000 arrives.

GAR

  • Full Member
  • ***
  • Posts: 131
  • Karma: +0/-0
Re: TED 5000 data logging and analysis with GNU/Linux
« Reply #23 on: September 06, 2011, 04:53:36 PM »
110906-0557 EDT

iteration69:

What was your wife doing with the oven that caused such a high energy use? Heating the house possibly.

Yes the TED system is a very good system for logging power consumption, and the Fluke is not. TED has good zero stability, and good accuracy.

At the present time I am almost certain that TED uses a current transformer for its current sensor. The only reason for any doubt is that I believe, early on, someone at Energy verbally told me Hall devices were used. With actual devices in hand, a Hall device is ruled out. There are only two wires to each sensor. If I had to, it would be possible to make a Hall sensor work with two wires, but four wires would be usual.

The very nature of current transformers requires that they be terminated in a low resistance, ideally 0 ohms. An open circuit on a current transformer can cause high voltage at the output and insulation failure, and possibly shock. The TED current transformers do not need to be plugged into the MTU to prevent excess voltage. Thus, one of two things, or maybe both, exists in the TED sensor assembly: a calibrated shunt (a moderately low-resistance resistor) that converts the current transformer current to a voltage, and/or a bidirectional voltage clamp (back-to-back diodes, for example, or a Transorb).

Some rough measurements on a 1000-series current sensor: with no load on the current sensor and 5 A through it, the output voltage is about 0.085 V; connected to the MTU box it is 0.072 V. In the MTU box is a voltage divider consisting of 49.9 ohm and 1300 ohm nominal values. I could not read the marking on the 1300 ohm, but this is a nominal value; the measured resistance of one was 1296 ohms. From the above measurements I estimate the load resistance in the current sensor is 250 or 249 ohms. #22 wire has a nominal resistance of 16.14 ohms per 1000 ft at 20 C, so a 50 ft run (100 ft of wire) is about 1.6 ohms. This is less than the probable tolerance of the 1300 ohm resistor (1%). Long wires on the current sensor should be no problem. Use twisted pair. You could do a distance of 1000 ft (2000 ft total loop) and make a slight calibration adjustment.
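The estimate in a few lines of Python, using only the numbers above (all assumed from my rough measurements, so treat it as a sketch):

Code:
# estimate the sensor's source resistance from the loaded/unloaded output voltages
v_open, v_loaded = 0.085, 0.072
r_divider = 49.9 + 1300.0                     # MTU input divider, nominal values

# voltage-divider relation: v_loaded = v_open * r_divider / (r_divider + r_source)
r_source = r_divider * (v_open / v_loaded - 1.0)
print("estimated sensor source resistance: %.0f ohms" % r_source)   # ~244 ohms, in the ballpark of ~250

# a 50 ft run (100 ft loop) of #22 twisted pair adds only a tiny series resistance
r_wire = 100 * 16.14 / 1000.0
print("added wire resistance: %.1f ohms" % r_wire)                  # ~1.6 ohms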

The approximate open-circuit voltage at 200 A is 3.4 V. The current through 250 ohms at 3.4 V is 0.0136 A, and the ratio of 200 to 0.0136 is about 14700. Thus, it appears there are something like 15000 turns on the current transformer secondary, and possibly many more because of the internal resistance of the coil. This seems unlikely. So are there many fewer turns and a lower resistance, with something in the range of 250 ohms added in series with an output lead to limit output current? If one sets the maximum power dissipation in the current transformer resistor at 1/2 W, then that resistance could be in the ballpark of 25 ohms, and the turns would be 1/10 the above. That seems more reasonable.
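The turns arithmetic, again as a quick sketch from the figures above (the 250 ohm and 25 ohm values are the assumed cases, not measured):

Code:
# implied secondary turns = primary current / secondary current
v_open_200A, i_primary = 3.4, 200.0

i_secondary = v_open_200A / 250.0             # ~0.0136 A if the internal resistance is ~250 ohms
print("implied turns at 250 ohms: ~%.0f" % (i_primary / i_secondary))   # ~14700

# if the actual burden were ~25 ohms (sized for ~1/2 W at full scale),
# the implied turns drop by a factor of ten
print("implied turns at 25 ohms:  ~%.0f" % (i_primary / (v_open_200A / 25.0)))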

You use the term mains. Does this mean you are in a British area? Canada possibly?

.

iteration69

  • Newbie
  • *
  • Posts: 26
  • Karma: +0/-0
Re: TED 5000 data logging and analysis with GNU/Linux
« Reply #24 on: September 07, 2011, 07:46:29 AM »
GAR.

I was leaning towards my wife turning the oven on and sitting in front of it for heat. But I have a pellet burner that will heat us out of the home in no time, so if she was cold she would have simply turned the pellet burner on high and sat in front of it (at least logic would dictate). I asked her about it and she said she was not turning the oven on and sitting in front of it, so I believe her (I have no reason not to trust her). My only guess is that she got into some bad cooking habits -- turning on the oven to make a single piece of chicken, or cooking just enough food for a single day, that type of bad habit. I did notice that the leftover ratio was slacking during that time period, and I really enjoy leftovers, so it was bothering me. ;)

I just received the TED 5000 + 4 MTUs today and am just now inspecting everything. I will make sure everything is working before I do anything that would void the warranty, then I'm going to add longer wires to the MTU current clamp. Feeling around the label on the current clamp, the side with "FCC" feels like there may be a screw under the label. If I can get the current clamp apart without damaging it, I'll try to get some pictures for you.

Thanks for the heads up on the signal. About "mains": this is something I picked up from working around laboratory equipment. I'm actually located in Friedens, Pennsylvania.

I'll let you know what I make out with the MTU's.

To be continued...
« Last Edit: September 07, 2011, 07:49:58 AM by iteration69 »

iteration69

  • Newbie
  • *
  • Posts: 26
  • Karma: +0/-0
Re: TED 5000 data logging and analysis with GNU/Linux
« Reply #25 on: September 08, 2011, 04:55:40 AM »
Before testing the new TED 5000, I decided to rework my existing TED 5000 that is in my home. I shortened the power leads to make it look better, and that was the start of major headaches. Apparently the length of wire I had (coiled up with zip ties) was acting like an inductor and filtering the noise on the lines perfectly, because now none of the MTUs are getting data back to the gateway. There is literally less than 12 inches of wire between the MTUs and the gateway. The only thing I changed was the length. Nothing, nada, zip. Dead in the water. So I tried to reset the gateway and then it locked me out: it kept asking for a password even though I never changed any configuration. I ended up hard-resetting the gateway, which brought the web interface back, but I lost all the setup. All MTUs are sending at the same time, which is causing all sorts of reflections on the line. Funny how their algorithm will even allow this to happen. Hey guys, ever hear of CSMA/CD? What can they say, they don't have enough resources to implement it? Well, if I can implement a modified CSMA/CD on an ATtiny2313 micro (128 bytes RAM, 2K flash), I'm sure they can pull this off with their PICs.

I forgot about all the BS we have to jump through. I had to configure the router for DHCP just so I could reconfigure the IP. I figured I could just add a rule based on the MAC, but they don't print the MAC address on the device. Come on guys, do I really have to go through this BS for every device I plan to set up!?

For most people I suppose this is not much of a problem, but I have dozens of network-enabled devices and three different networks here. Turning on DHCP opens up a can of worms due to some other half-assed devices which take stupid-friendly to the next level, the Roku being one of the major players in lack of advanced configuration. They will grab up the DHCP IP slot ASAP (even though they already have a lease; it does not matter if it is 12 hours or 12 seconds, they want a new address and you can't do a damn thing about it -- zero options). This stupid-friendly configuration really eats at me, but at least the TED 5000 allows us to manually set the IP after our routers and devices eventually negotiate one. Of all the devices on my network, guess which ones cause all the problems? You got it: the stupid-friendly ones using DHCP. I could easily say I hate DHCP, but it's not the protocol's fault. It's the half-baked nonsense of trying to make everything stupid-friendly which causes these problems.

Due to the current bump in the road I've not been able to test the new TED 5000. I'm most certainly printing the MAC address on each gateway from now on. I may set up the LAN port on my laptop as a DHCP server so that I can force IP addressing without mucking up my heavily used network.

I'm at work now, blind to my energy use, and so is my log. Footprints is showing static use on everything. I remotely reset it; still no go. A big ole hole in the data, all because I wanted to make my setup look good. I should have left it alone.

I guess I'll be making a filter when I get home.

Ugh..

iteration69

  • Newbie
  • *
  • Posts: 26
  • Karma: +0/-0
Re: TED 5000 data logging and analysis with GNU/Linux
« Reply #26 on: September 10, 2011, 05:15:38 AM »
So I just found out via other sources that the user GAR has been banned for "Excessive useless information posted".

Look his posts over -- I don't see anything "useless" about what he or others have been sharing while trying to make this product work!


Not Happy, Not happy at all.