[Help]Hp 50g : working with larger-than-memory datasets
03-28-2017, 06:26 AM
Post: #7
RE: [Help]Hp 50g : working with larger-than-memory datasets
(03-28-2017 05:47 AM)cyrille de brébisson Wrote: So if you only modify a small part of the data each time, you would be better off separating your data into its constitutive elements, or at least into very small chunks, for example 10 elements at a time in a variable.

Thanks for the tip. I know that flash is quite easy to wear out with write cycles (I know it from years of using OpenWrt on very limited hardware, then all the smartphones, tablets, etc.). Of course the size of each write has to be "smart", but I was expecting that port 2 (the internal flash) nevertheless has a wear-leveling algorithm, so one could use it for a pretty long time. Am I wrong on this one?

Wikis are great, Contribute :)
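The chunking strategy from the quote can be sketched as follows. This is an illustrative Python model, not 50g code: on the calculator each chunk would live in its own port-2 variable, and the names and chunk size here are assumptions, not an actual API.

```python
# Model of the idea: store a large dataset as many small chunks so that
# modifying one element only rewrites one small chunk (one variable),
# not the whole dataset. This is what limits flash write wear.

CHUNK_SIZE = 10  # elements per chunk, as suggested in the quoted post

def split_into_chunks(data, chunk_size=CHUNK_SIZE):
    """Split a flat list into fixed-size chunks (the last may be shorter)."""
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

def update_element(chunks, index, value, chunk_size=CHUNK_SIZE):
    """Modify one element; only the chunk containing it would be rewritten."""
    chunk_no, offset = divmod(index, chunk_size)
    chunks[chunk_no][offset] = value
    return chunk_no  # the only chunk whose backing storage needs a write

chunks = split_into_chunks(list(range(100)))
dirty = update_element(chunks, 57, -1)
# Only chunks[5] changed: a 10-element write instead of a 100-element one.
```

With 100 elements in 10-element chunks, each modification touches one tenth of the data, so the flash sees roughly ten times fewer rewritten bytes per update than storing everything in a single variable.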
Messages In This Thread |
[Help]Hp 50g : working with larger-than-memory datasets - pier4r - 03-26-2017, 11:11 AM
RE: [Help]Hp 50g : working with larger-than-memory datasets - Han - 03-26-2017, 01:57 PM
RE: [Help]Hp 50g : working with larger-than-memory datasets - pier4r - 03-26-2017, 06:34 PM
RE: [Help]Hp 50g : working with larger-than-memory datasets - Claudio L. - 03-27-2017, 12:21 AM
RE: [Help]Hp 50g : working with larger-than-memory datasets - pier4r - 03-27-2017, 09:44 AM
RE: [Help]Hp 50g : working with larger-than-memory datasets - cyrille de brébisson - 03-28-2017, 05:47 AM
RE: [Help]Hp 50g : working with larger-than-memory datasets - pier4r - 03-28-2017, 06:26 AM
RE: [Help]Hp 50g : working with larger-than-memory datasets - cyrille de brébisson - 03-29-2017, 05:42 AM