Linear Resampling Datapoints Captured at Fluctuating Time Intervals, to Fixed Time Intervals, in Swift


I don't fully understand the math you want to do, but I do understand how to use Accelerate. I created a wrapper that makes this Accelerate function easier to call and shows how it works.

/**
Vector linear interpolation between neighboring elements

- Parameter a: Input vector.
- Parameter b: Input vector: integer parts are indices into a and fractional parts are interpolation constants.

Performs the following operation:

```C
for (n = 0; n < N; ++n) {
    double b = B[n];
    double index = trunc(b);         // integer part of B value
    double alpha = b - index;        // fractional part of B value

    double a0 = A[(int)index];       // indexed A value
    double a1 = A[(int)index + 1];   // next A value

    C[n] = a0 + (alpha * (a1 - a0)); // interpolated value
}
```
Generates vector C by interpolating between neighboring values of vector A as controlled by vector B. The integer portion of each element in B is the zero-based index of the first element of a pair of adjacent values in vector A.

The value of the corresponding element of C is derived from these two values by linear interpolation, using the fractional part of the value in B.
*/
import Accelerate

func interpolate(a: inout [Double], b: inout [Double]) -> [Double] {
    var c = [Double](repeating: 0, count: b.count)
    vDSP_vlintD(&a, &b, 1, &c, 1, vDSP_Length(b.count), vDSP_Length(a.count))
    return c
}
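
A quick usage example (the input values here are made up purely for illustration):

var a: [Double] = [10, 20, 30]
var b: [Double] = [0.0, 0.5, 1.25]   // 1.25 means index 1, fraction 0.25

interpolate(a: &a, b: &b)            // [10.0, 15.0, 22.5]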

EDIT: Alright, I've wrapped my head around your problem and understand now what you want to do. It was pretty fun to do; I came up with this:

import Accelerate

func calculateB(sampleTimes: [Double], outputTimes: [Double]) -> [Double] {
    var i = 0
    return outputTimes.map { (time: Double) -> Double in
        // Advance to the next sample interval after this value is returned
        defer {
            if time > sampleTimes[i] { i += 1 }
        }
        return Double(i) + (time - sampleTimes[i]) / (sampleTimes[i + 1] - sampleTimes[i])
    }
}

func interpolate(b: inout [Double], data: inout [Double]) -> [Double] {
    var c = [Double](repeating: 0, count: b.count)
    vDSP_vlintD(&data, &b, 1, &c, 1, vDSP_Length(b.count), vDSP_Length(data.count))
    return c
}

let sampleTimes: [Double] = [0.0, 1.3, 2.2, 3.4, 4.2, 5.5, 6.6, 7.2, 8.4, 9.5, 10.0]
let outputTimes: [Double] = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10]

var metric_1 : [Double] = [4, 3, 6, 7, 4, 5, 7, 4, 2, 7, 2]
var metric_2 : [Double] = [5, 4, 7, 5, 6, 6, 1, 3, 1, 6, 7]
var metric_3 : [Double] = [9, 8, 5, 7, 4, 8, 5, 6, 8, 9, 5]

var b = calculateB(sampleTimes: sampleTimes, outputTimes: outputTimes)

interpolate(b: &b, data: &metric_1) // [4, 3.230769, 5.333333, 6.666667, 4.75, 4.615385, 5.909091, 5, 2.666667, 4.727273, 2]
interpolate(b: &b, data: &metric_2) // [5, 4.230769, 6.333333, 5.666667, 5.75, 6, 3.727273, 2.333333, 1.666667, 3.727273, 7]
interpolate(b: &b, data: &metric_3) // [9, 8.230769, 5.666667, 6.333333, 4.75, 6.461538, 6.636364, 5.666667, 7.333333, 8.545455, 5]

The inout parameters (and the vars at the call site) are necessary so the arrays can be passed to Accelerate as pointers. I don't know whether calculateB could be done with Accelerate; it's probably possible, but it's a pain to search for the correct vDSP functions...
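
To see what calculateB produces, take outputTimes[1] = 1.0: it falls between sampleTimes[0] = 0.0 and sampleTimes[1] = 1.3, so b[1] = 0 + (1.0 - 0.0) / (1.3 - 0.0) ≈ 0.769. vDSP_vlintD then blends metric_1[0] = 4 and metric_1[1] = 3 as 4 + 0.769 * (3 - 4) ≈ 3.2308, which is exactly the second value in the metric_1 comment above.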

What is the best way to interpolate points between coordinates in a Google Maps path? Is this approach correct?

Google itself provides various interpolation methods in the GMSGeometryUtils module. I think what you need is GMSGeometryInterpolate:
https://developers.google.com/maps/documentation/ios-sdk/reference/group___geometry_utils.html#gad0c5870bd9d182d22310f84a77888124

GMSGeometryInterpolate returns the coordinate that lies at the given fraction along the shortest path between the 'from' and 'to' coordinates you provide, and since a GMSPath itself connects each pair of consecutive coordinates by the shortest path, this should be sufficient.
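
As a minimal sketch of how this could be used to generate evenly spaced points between two coordinates (the helper function and the coordinate values are illustrative; it assumes the Google Maps iOS SDK is linked):

import CoreLocation
import GoogleMaps

// Returns `count` points spaced evenly along the shortest (great-circle)
// path between `from` and `to`, excluding the endpoints themselves.
func interpolatedPoints(from: CLLocationCoordinate2D,
                        to: CLLocationCoordinate2D,
                        count: Int) -> [CLLocationCoordinate2D] {
    guard count > 0 else { return [] }
    return (1...count).map { step in
        let fraction = Double(step) / Double(count + 1)
        return GMSGeometryInterpolate(from, to, fraction)
    }
}

let from = CLLocationCoordinate2D(latitude: 59.32, longitude: 18.07)
let to = CLLocationCoordinate2D(latitude: 59.33, longitude: 18.08)
let points = interpolatedPoints(from: from, to: to, count: 9)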

How to make a multipart form data POST request in Swift?

I figured it out.

First of all, I should have checked whether

var sum = CreateAccountAuth.shared.userDetails?.id

is nil or not, and it turns out it is nil. I'll use UserDefaults to store that Int value from now on to persist it (see the sketch after the revised code below).

Second, the form accepts both Int and String as text, which means the code I wrote for converting Int to Data is not needed. Simply write the value using string interpolation:

 "\(Integer)"

And third, whenever sending an image we need to give it a filename; my code didn't have one, so the upload was failing.

So here's the revised code. I am using Alamofire now, since it is much easier:

func uploadPost() {
    guard let imageData = self.image?.jpegData(compressionQuality: 1) else {
        print("the image data is empty")
        return
    }

    AF.upload(multipartFormData: { multipartdata in
        // The backend accepts numeric fields as plain text, so string
        // interpolation is enough; no manual Int-to-Data conversion needed.
        multipartdata.append("\(2)".data(using: .utf8)!, withName: "metauserID")
        multipartdata.append("\(1)".data(using: .utf8)!, withName: "boostTagsID")
        multipartdata.append("\(1)".data(using: .utf8)!, withName: "shop_ID")
        multipartdata.append("\(1)".data(using: .utf8)!, withName: "discount_ID")
        multipartdata.append("\(1)".data(using: .utf8)!, withName: "product_categoryID")
        multipartdata.append(self.artworkDescription.data(using: .utf8)!, withName: "producDescription")
        multipartdata.append(self.artworkTitle.data(using: .utf8)!, withName: "productName")

        // The image part needs a filename, and the MIME type must match
        // the JPEG data produced above.
        multipartdata.append(imageData, withName: "product_image1", fileName: "\(UUID().uuidString).jpg", mimeType: "image/jpeg")
    }, to: "my Backend URL").responseJSON { data in
        print(data)
    }
}
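
And as mentioned above, a minimal sketch of persisting the user id with UserDefaults (the key name and helper functions are hypothetical, just for illustration):

import Foundation

// Hypothetical key name for illustration
let userIDKey = "userID"

func saveUserID(_ id: Int) {
    UserDefaults.standard.set(id, forKey: userIDKey)
}

func loadUserID() -> Int? {
    // object(forKey:) returns nil when the key was never set,
    // unlike integer(forKey:), which would silently return 0.
    return UserDefaults.standard.object(forKey: userIDKey) as? Int
}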

CoreAudio: change sample rate of microphone and get data in a callback?

The render callback is used as a source for an audio unit. If you set the kAudioOutputUnitProperty_SetInputCallback property on the remoteIO unit, you must call AudioUnitRender from within the callback you provide, and then you would have to do the sample rate conversion manually, which is ugly.

There is an "easier" way. The remoteIO acts as two units: the input (mic) and the output (speaker). Create a graph with a remoteIO, then connect the mic to the speaker using the desired format. You can then get the data using a renderNotify callback, which acts as a "tap".

I created a ViewController class to demonstrate:

#import "ViewController.h"
#import <AudioToolbox/AudioToolbox.h>
#import <AVFoundation/AVFoundation.h>

// Forward-declare the render notify callback so it can be referenced
// from -viewDidLoad before its definition below.
static OSStatus renderNotify(void *inRefCon,
                             AudioUnitRenderActionFlags *ioActionFlags,
                             const AudioTimeStamp *inTimeStamp,
                             UInt32 inBusNumber,
                             UInt32 inNumberFrames,
                             AudioBufferList *ioData);

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];

    //Set your audio session to allow recording
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:NULL];
    [audioSession setActive:YES error:NULL];

    //Create graph and units
    AUGraph graph = NULL;
    NewAUGraph(&graph);

    AUNode ioNode;
    AudioUnit ioUnit = NULL;
    AudioComponentDescription ioDescription = {0};
    ioDescription.componentManufacturer = kAudioUnitManufacturer_Apple;
    ioDescription.componentType = kAudioUnitType_Output;
    ioDescription.componentSubType = kAudioUnitSubType_VoiceProcessingIO;

    AUGraphAddNode(graph, &ioDescription, &ioNode);
    AUGraphOpen(graph);
    AUGraphNodeInfo(graph, ioNode, NULL, &ioUnit);

    //Enable input (the mic) on bus 1
    UInt32 enable = 1;
    AudioUnitSetProperty(ioUnit, kAudioOutputUnitProperty_EnableIO, kAudioUnitScope_Input, 1, &enable, sizeof(enable));

    //Set the output of the ioUnit's input bus, and the input of its output bus, to the desired format.
    //Core Audio basically has implicit converters that we're taking advantage of.
    AudioStreamBasicDescription asbd = {0};
    asbd.mSampleRate = 16000.0;
    asbd.mFormatID = kAudioFormatLinearPCM;
    asbd.mFormatFlags = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    asbd.mFramesPerPacket = 1;
    asbd.mChannelsPerFrame = 1;
    asbd.mBitsPerChannel = 16;
    asbd.mBytesPerPacket = 2;
    asbd.mBytesPerFrame = 2;

    AudioUnitSetProperty(ioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Output, 1, &asbd, sizeof(asbd));
    AudioUnitSetProperty(ioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, 0, &asbd, sizeof(asbd));

    //Connect the output of the remoteIO's input bus to the input of its output bus
    AUGraphConnectNodeInput(graph, ioNode, 1, ioNode, 0);

    //Add a render notify with a bridged reference to self (if using ARC)
    AudioUnitAddRenderNotify(ioUnit, renderNotify, (__bridge void *)self);

    //Start graph
    AUGraphInitialize(graph);
    AUGraphStart(graph);
    CAShow(graph);
}
static OSStatus renderNotify(void *inRefCon,
                             AudioUnitRenderActionFlags *ioActionFlags,
                             const AudioTimeStamp *inTimeStamp,
                             UInt32 inBusNumber,
                             UInt32 inNumberFrames,
                             AudioBufferList *ioData) {

    //Filter anything that isn't a post-render call on the input bus
    if (!(*ioActionFlags & kAudioUnitRenderAction_PostRender) || inBusNumber != 1) {
        return noErr;
    }
    //Get a reference to self
    ViewController *self = (__bridge ViewController *)inRefCon;

    //Do stuff with audio

    //Optionally mute the audio by setting it to zero
    for (UInt32 i = 0; i < ioData->mNumberBuffers; i++) {
        memset(ioData->mBuffers[i].mData, 0, ioData->mBuffers[i].mDataByteSize);
    }
    return noErr;
}

@end

Xcode 8.0 Swift 3.0 slow indexing and building

I solved the problem by commenting out all files and then removing the comments one by one. I found that the problem lies in the array declaration, as described here.

I had code like this and project was not indexing:

class Example {
    var first: String!
    var second: String!
    var third: String!
    var fourth: String!
    var fifth: String!

    func abc() -> [String] {
        var array = [first, second, third, fourth, fifth]
        return array
    }
}

I've changed it to this and indexing started working:

class Example {
    var first: String!
    var second: String!
    var third: String!
    var fourth: String!
    var fifth: String!

    func abc() -> [String] {
        var array = [first]

        array.append(second)
        array.append(third)
        array.append(fourth)
        array.append(fifth)
        return array
    }
}
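
For what it's worth, an explicit type annotation on the array is another common way to avoid this kind of type-inference slowdown, without splitting the literal into appends (a sketch of the same class, using a placeholder name):

class Example {
    var first: String!
    var second: String!
    var third: String!
    var fourth: String!
    var fifth: String!

    func abc() -> [String] {
        // The explicit [String] annotation spares the type checker from
        // having to infer the literal's type from five implicitly
        // unwrapped optionals.
        let array: [String] = [first, second, third, fourth, fifth]
        return array
    }
}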

