I feel like this would have something to do with solving the heat equation in polar coordinates
There is no such thing as curves; you are looking at spikes hastily scrawled as a directly correlated ratio to many other spikes. The idea of curves is nonsense.
If a system x of knowns, equitable to a system y of unsolved constraints, may contrive a system z, then z1 can be equalized throughout its own systems of x and y. Then z2, a comparative model, interacts in a contrived measure such that both compensate for z3, a system of comparisons of the known x, y, and z states. That system, example 1, is definable in trial and error for the remodeled versions by system 2. It would arguably become more likened to the letter, so that by system 3 it would be marginally acceptable; by implication of process, the qualifications would express themselves through continued adjustment. This model would then be determinable as the z1 state and z2 state if an adjusted state of z3 is applied. So z3 becomes the baseline representative in, or translated for, any series.

My dude has batter, that dude has Bisquick, and some amount of either is the shake and bake. I'm bored as fuck right now, but it's like 500 pages of how to cook it right. You got density and shit like wave mechanics and vacuums and resonance and harmonics disputing or supporting either, depending. Just saying, it's a lot of dynamics for shit like anti-poles or some shit. I still say you are just redefining source and field properties of direct linear values to fabricate a make-believe thing. Like there's only one example of a curve existing, and that's by spontaneity in all probable outcomes; it just does, by an ever-disrupted value in scalar chance, like a pussytickler in math.
I recently stumbled upon this video about Geometric Algebra: https://www.youtube.com/watch?v=60z_hpEAtD8
I learned a bit about it before, but this one, while he admits the video isn't a teaching guide, is a pretty good 'showcase' of what Geometric Algebra is about and what it can do.
It's a shame this approach isn't taken in most college physics courses.
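For anyone curious what the algebra looks like computationally, here's a minimal sketch of the 2D geometric product in Python. The function name and the (scalar, bivector) return convention are my own toy choices, not from the video; real packages like `clifford` handle the general multivector case properly.

```python
# Minimal sketch of the 2D geometric product of two vectors.
# For vectors a and b, the product ab = a.b + a^b: a symmetric scalar
# part (the dot product) plus an antisymmetric bivector part (the
# oriented area spanned by a and b, the e1^e2 component).

def geometric_product(a, b):
    """Geometric product of 2D vectors a=(a1,a2), b=(b1,b2).

    Returns (dot, wedge) so that ab = dot + wedge * e12.
    """
    a1, a2 = a
    b1, b2 = b
    dot = a1 * b1 + a2 * b2    # a . b  (scalar part)
    wedge = a1 * b2 - a2 * b1  # a ^ b  (bivector part, oriented area)
    return dot, wedge

# Orthogonal vectors give a pure bivector; parallel ones a pure scalar.
print(geometric_product((1, 0), (0, 1)))  # (0, 1)
print(geometric_product((2, 0), (3, 0)))  # (6, 0)
```

Note how the two cases recover the dot and cross products as the two halves of one product, which is the basic point the video makes.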
This sounds like you're talking about a subject in complex analysis (i.e. calculus with complex numbers) called 'conformal mapping'. Not only does it offer an approach to turn any shape into any other shape, it takes the interior of the shape and 'bends' it so that, at least on a microscopic level, it doesn't look like you've altered any angles (i.e. squares still look like squares).
Remember those cheesy 'morphing' effects you'd see all the time in the 1990s? That's done with conformal mapping.
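You can check the angle-preservation claim numerically. Below is a toy sketch of my own (not from any particular course) using the squaring map z ↦ z², which is conformal away from z = 0: two tiny perpendicular steps at a point stay perpendicular after mapping.

```python
# Sketch: conformal maps preserve angles locally. The map f(z) = z^2
# bends shapes, but the angle between two tiny steps at a point is
# unchanged after mapping (wherever f'(z) != 0, i.e. away from z = 0).
import cmath

def f(z):
    return z * z

def local_angle_between(z0, d1, d2, h=1e-6):
    """Angle between the images of two tiny steps d1, d2 taken at z0."""
    w1 = f(z0 + h * d1) - f(z0)
    w2 = f(z0 + h * d2) - f(z0)
    return abs(cmath.phase(w2 / w1))

# A right angle between steps along 1 and 1j stays a right angle:
angle = local_angle_between(1 + 2j, 1, 1j)
print(angle)  # ~ pi/2, i.e. about 1.5707963
```

The same check works for any analytic map with nonzero derivative at the point, which is what makes those morphing effects look smooth up close.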
I just saw that.
1: There is no other disruption as noticeable as the coursing of time, a value that is in theory always changing. It corresponds directly to the flow of observed information: prioritized data, assessed through a 'center point', aka reality. The observation is simply mitigated data of the most available flux-curve of its origin or signature (in time). Approximated.
This means an averaging of fall-off (exposure) to the center of the universe in its beginning locale(s) is our frame of reference, a singularity. The source, which we do not see, is measured (in more extreme cases only by distance or 'rate of observed data'), which suggests we are polarized by flux. We move slower than what we see, and data is only approximate to what we are informed; 'the future-past' curves an oculus (particle).
2: Reality is ever expanding, a polar flux filtered by frequency of observation (critical mass in causality of half-life). The more 'energetic' or 'exposed' something is, the more information is mitigated (data is assessed at a faster rate and more is overlooked). We move faster with higher fallout rates and closer to critical mass, aka 'annihilation' or polarization. This process is Curve Theory.
3: In it, the logistics of handling data transfer goes by the amount of particle annihilation taking place. Prioritized routing goes by the degree or radian in flux, or by its angle being up or down, using a prediction of how much until it occurs in causality or is adjusted, depending on the amount of exposure the data has (it is theorized it increases with speed or by chance). A selected flux is assigned a preferred observation over the rest (by favorable signatures), so the data logistics is made more controllable. So long as that system takes place, a pathological gyroscopic position, in which observations are predicted based on orientation in time, can be programmed for data mining.
4: So you have a gyroscopic positioning system that collects favorable weather reports of the best route through time possible, such that you can supersede time by overstacking data and energy (by changing the position correctly). To flatten the curve, basically: less bridging in referenced frames takes place *(or something)*, and faster-than-light travel or a superfluid may be possible. Esp with a Gun on it.
It really depends on where you want to start.
You can say that a polarized issuance of data centralizes its alignment as having all or nothing from a source in a timeline.
Or a more complex system was in place involving atomic weights that would decide the same. Hydrogen and helium just happen because they pretty much want to, not because they are supposed to. So obviously there are intrinsic issuances that maybe get more or less complex, depending.
But when you put those two together, you could basically punch it all a new butthole and open a portal into some dimension where in fact it's both, that is, if you don't get basically liquefied by gravity welling up from doing so.
Sending a value of 1 from point A to point B.
Sending a dirty water molecule through a water treatment plant to a drinking fountain. Sending a "hello" signal from Earth to Mars. Rumor has it Japan is refining its trains and railways to become faster than jets; soon the Japanese will create some portal to deliver meat in less than seconds.
You may or may not agree with experimental ideas of firmament, but there are a lot of them. For me specifically, it's how a photon transcends most physical rules; it seems to act as a fuel unit for a lot of the rest of the universe, no? Understanding the levels at which that may operate unlocks some very interesting stuff beyond just UFOs and things like residual light. But, IMO, I think using photons to encode virtual space and then print it is the future of things. Using a mixture of 4 staged existential theorems. Not to make it sound complicated.
Though wanting it to be true first of all does not mean it is. A data system, as I have described, along with a material emplacement of atoms: obviously they would break down and do what they do in a hybrid system. Some say that is real enough, but the breaking down only promotes more questions as to how things really work.
Those special particles they don't teach because it's still not really proven, but it makes sense that they have to abide by at least SOME level of existence. We know there are pretty ways in which things are applied, and in certain conditions.
So using them becomes, mildly put, a quantum-realized world; as it is, that means we can further exploit it with a new approach. Using a fissure of all things to cherry-pick and reconstruct the world in their new design, that seems to be a lot of what has just been described as our 'by sheer chance' universe. Having photons and the reins of magnitude, I think it is possible.
The issue comes from just how to put those together, since the 'put' part is a big if.
Just for an eye-opener, honestly it's a little daunting. For example: pure photons, unobserved and en masse, come only from, uh… lightning maybe, and, uh… a nuclear explosion. How are we going to "harness" a nuclear-on-demand thing like that? We would need a massive super-colliding centrifuge; of course, only one place we know has that kind of thing, and no, they aren't letting you stick an N-Pill in it. So it's kind of a hard kickstart.
Next in line: even if we get that, we need a means of printing it with the very same amount of "light dispersion", for all the parts need to be perfect down to their chemistry. And that's gonna take one hell of a laser; the closest thing we have is maybe the very same collider that breaks open particles in the first place, just modified for a 3D sort of printing area.
That is all I can honestly say; it would have to be done from space, more than likely, like a big tattoo gun.
Is knowledge of all this math/computer nerd stuff a turn-off or turn-on for you when looking for a partner?
math and EE are some of my passions in life. It would be kind of hard to relate to someone if we didn't share interests.
Well, effectively you have just removed yourself from the gene pool, thus proving intelligence is potentially counterproductive to evolution.
Interesting, since Darwinism states that survival of the fittest is essential, but at some level, when beings with higher brain function decide NOT to procreate, evolution has hit a dead-man's switch of sorts.
Since when did mathematicians and engineers NOT get laid?
The vast majority of humans value intelligence and the ability to impact the world. Something most also lack.
I have NEVER seen an EE turned down in favor of blue-collar workers unless they're unemployed at the time, especially women. Women in STEM get mistreated and underpaid, so companies love to hire them to get quality work for cheaper.
mods, please shadowban this buzzword-spouting spamming schizo
math is fake libtard shit
what the fuck schizo shit did you guys invite into my thread
I just wanted to talk about differential equations and curvature
Pic looks like a fox's anus. We should rename our galaxy to the Fox Anus Galaxy.
>>3620608
>I just wanted to talk about differential equations and curvature
burn in hell, faggot
Not sure what I want to suck on more, her toes or her tailhole.
>>3620569
>shadowban this buzzword-spouting spamming schizo
Does lulz even have a "shadowban" feature?
mods? you mean the site admin who actually is fair and based.
geometric algebra seems pretty swanky tbh
I wonder if I can find any applications to signal processing with it
This is a Fourier analysis problem. Watch this video: https://youtu.be/r6sGWTCMz2k
So now you know how to trace your loop using a series of epicycles. The trick is to smoothly scale the c_1 component to the radius of the circle whose circumference equals the arc length of your loop (c_1_final = r = arc length / 2π), while also smoothly scaling all the other epicycle terms to zero except c_0. The c_0 component is just the vector pointing to the centroid of your loop, as mentioned in the video. You would need to finely tune the scaling rate of each term if you want the arc length to stay the same throughout the process. I don't know how to do that; maybe there is no general method for arbitrary loops.
>>3620459
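The scheme above can be sketched in a few lines of Python. All names here (`fourier_coeffs`, `morph_to_circle`) are made up for illustration, and the linear blend is the simplest possible scaling: it does NOT preserve arc length along the way, only matches circumference to arc length at the end.

```python
# Sketch of the epicycle idea: approximate a closed loop z(t) with
# DFT coefficients c_k, then morph toward a circle by scaling every
# term except c_0 (centroid) and c_1 (radius) to zero.
import cmath
import math

def fourier_coeffs(points, n_terms):
    """DFT-style coefficients c_k for a loop given as complex samples."""
    n = len(points)
    return {k: sum(points[j] * cmath.exp(-2j * math.pi * k * j / n)
                   for j in range(n)) / n
            for k in range(-n_terms, n_terms + 1)}

def morph_to_circle(coeffs, arc_length, s):
    """Blend the loop's coefficients toward a circle, s in [0, 1].

    At s=1 only c_0 (the centroid) and c_1 survive, with
    c_1 = arc_length / 2π so circumference matches arc length.
    """
    r = arc_length / (2 * math.pi)
    out = {}
    for k, c in coeffs.items():
        if k == 0:
            out[k] = c                     # centroid stays fixed
        elif k == 1:
            out[k] = (1 - s) * c + s * r   # blend toward the radius
        else:
            out[k] = (1 - s) * c           # all other epicycles shrink
    return out

# Example: a unit square traced by 100 samples, morphed fully (s=1).
square = ([complex(i / 25, 0) for i in range(25)]
          + [complex(1, i / 25) for i in range(25)]
          + [complex(1 - i / 25, 1) for i in range(25)]
          + [complex(0, 1 - i / 25) for i in range(25)])
coeffs = fourier_coeffs(square, 10)
final = morph_to_circle(coeffs, arc_length=4.0, s=1.0)
print(abs(final[1]))  # 4 / (2*pi), about 0.6366
```

Reconstructing the curve at any intermediate s (summing c_k·e^(2πikt)) gives the in-between morph frames; keeping arc length constant throughout would indeed need some per-s renormalization that this sketch doesn't attempt.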
Does anyone else work the challenge problems from this guy's videos? For squares as the input (highlighted numbers), I got for the nth term of the Moessner sum: (n!)^2/n
Highlighted numbers in the top row that is.
another pair of probably trivial simplifications
I realized the right hand side could be "simplified" somewhat with a substitution but I don't think it actually accomplishes much
You have a series that basically differs by step, to represent something whole overall, using knowns and unknowns.
You can break the steps down into a whole answer, with each step arriving at a separate solvability of a supposed pattern-sum.
But that does not outright mean they are congruent, even if they are perfectly aligned to be a new part. (There is no explicitly real loop unless it is predesigned as toy-math to be such.)
Fourier is trying to connect parts in which clues may surface as to the progression of the series. If you do KNOW the progression from n=10 into n=20, what exactly makes that likely? That is leaving the "jump" to interpretation, so…:
When there is no such thing as ellipses in a set where the patterns make ellipses, then it may in fact show that the series is a pattern that simply loops as a toroid, with you having just defined each layer further. *It just may not be so, even if it looks so.* So the progressions might just be anything *(including integrals, now that it's been set to an overall Fourier measure)*.
Knowing zero, you have just mapped points that either are the same, are congruent, or are just different instances of points in a graph. So you can triangulate them; of course, that is what Fourier is probably trying to suggest originally. (You can also submit them to other functions.)
With Fourier you are hoping a script is going to promote integer progression, since you assume to have all unknowns defined (but the unknowns are more than likely going to show up again). You aren't solving them per se, because you've balanced them to a 'progression'; they are originally fabricated based off a concept of triangulating zero into itself, "a solved quotient", or an infinitesimal and therefore usable "special case".
So having done that, your new 'forms' would show their case for congruency to the original. That is all that they can really do.
If they are perfect, then they would be indistinguishable, while perhaps allowing for a separate condition in which you can rewrite the series notation where specific notation is required.
For instance, with a separate comparative model in a different notation, you would want to make -sure- that their forms can each effectively show that mode for the other.
And if it were to the preference of design, perhaps they are part of the same tangent; then that tangent can be recomposed under those series which you have now outlined.
So by special works like chain-blocking or graphing 'peaks' and curve-points of the series, with what you have outlined, you find a "very special" case integer in tangent. So it narrows down guesswork.
I guess what I am trying to say is that in Fourier analysis it would occur in the progression's variable consistency checks: as the complexities arise, the less accurate the overall result becomes.