<issue_start><issue_comment>Title: Ozone Read fails with StatusRuntimeException after 2 datanode failures in Ratis pipeline
username_0:
https://issues.apache.org/jira/browse/HDDS-1809
This issue is fixed by the code change in https://issues.apache.org/jira/browse/HDDS-1713.
Previously, the network topology used the IP address as the node key in the topology cluster, which meant the three sorted datanodes could all resolve to the same node.
Now the datanode UUID is used as the node key in the topology cluster, so the three sorted datanodes are guaranteed to be distinct nodes.
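The key change described above can be sketched in isolation. This is a minimal illustration, not the actual HDDS/SCM code; the `Datanode` and `TopologyCluster` types here are hypothetical stand-ins:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical stand-ins for the SCM topology types; not the real HDDS classes.
class Datanode {
    final String uuid;       // unique per datanode
    final String ipAddress;  // shared when several datanodes run on one host
    Datanode(String uuid, String ipAddress) {
        this.uuid = uuid;
        this.ipAddress = ipAddress;
    }
}

class TopologyCluster {
    private final Map<String, Datanode> nodes = new LinkedHashMap<>();

    // Before the fix the map was effectively keyed by dn.ipAddress, so
    // datanodes sharing a host collapsed into a single topology node and
    // the "three sorted datanodes" all resolved to the same entry.
    void add(Datanode dn) {
        nodes.put(dn.uuid, dn);
    }

    int size() {
        return nodes.size();
    }
}
```

With UUID keys, three datanodes that share one IP address register as three distinct topology nodes; with IP keys they would have collapsed to one.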
<issue_comment>username_1: :broken_heart: **-1 overall**
| Vote | Subsystem | Runtime | Comment |
|:----:|----------:|--------:|:--------|
| 0 | reexec | 532 | Docker mode activated. |
||| _ Prechecks _ |
| +1 | dupname | 0 | No case conflicting files found. |
| +1 | @author | 0 | The patch does not contain any @author tags. |
| +1 | test4tests | 0 | The patch appears to include 1 new or modified test files. |
||| _ trunk Compile Tests _ |
| +1 | mvninstall | 633 | trunk passed |
| +1 | compile | 389 | trunk passed |
| +1 | checkstyle | 65 | trunk passed |
| +1 | mvnsite | 0 | trunk passed |
| +1 | shadedclient | 848 | branch has no errors when building and testing our client artifacts. |
| +1 | javadoc | 162 | trunk passed |
| 0 | spotbugs | 457 | Used deprecated FindBugs config; considering switching to SpotBugs. |
| +1 | findbugs | 673 | trunk passed |
||| _ Patch Compile Tests _ |
| +1 | mvninstall | 560 | the patch passed |
| +1 | compile | 371 | the patch passed |
| +1 | javac | 371 | the patch passed |
| +1 | checkstyle | 68 | the patch passed |
| +1 | mvnsite | 0 | the patch passed |
| +1 | whitespace | 0 | The patch has no whitespace issues. |
| +1 | shadedclient | 671 | patch has no errors when building and testing our client artifacts. |
| +1 | javadoc | 164 | the patch passed |
| +1 | findbugs | 689 | the patch passed |
||| _ Other Tests _ |
| +1 | unit | 303 | hadoop-hdds in the patch passed. |
| -1 | unit | 1230 | hadoop-ozone in the patch failed. |
| +1 | asflicense | 38 | The patch does not generate ASF License warnings. |
| | | 7562 | |
| Reason | Tests |
|-------:|:------|
| Failed junit tests | hadoop.ozone.client.rpc.TestReadRetries |
| | hadoop.ozone.web.client.TestKeysRatis |
| | hadoop.ozone.web.client.TestKeys |
| | hadoop.ozone.web.TestOzoneWebAccess |
| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | Client=19.03.0 Server=19.03.0 base: https://builds.apache.org/job/hadoop-multibranch/job/PR-1143/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/1143 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient findbugs checkstyle |
| uname | Linux 87eeba2f473d 4.4.0-138-generic #164-Ubuntu SMP Tue Oct 2 17:16:02 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | personality/hadoop.sh |
| git revision | trunk / eb36b09 |
| Default Java | 1.8.0_212 |
| unit | https://builds.apache.org/job/hadoop-multibranch/job/PR-1143/1/artifact/out/patch-unit-hadoop-ozone.txt |
| Test Results | https://builds.apache.org/job/hadoop-multibranch/job/PR-1143/1/test
|
)
{
if (size < 16)
throw new Exception("Bad FMT chunk-size");
if (hz != -1)
throw new Exception("Has 2nd FMT");
int formatID = readInt(reader, 2);
channelNum = readInt(reader, 2);
hz = readInt(reader, 4);
int bytePerSec = readInt(reader, 4);
int blockSize = readInt(reader, 2);
bitPerSample = readInt(reader, 2);
if (formatID != 1) throw new Exception("formatID is not PCM");
if (channelNum != 1 && channelNum != 2) throw new Exception("Bad channelNum");
if (hz < 1 || 2100000000 / 4 < hz) throw new Exception("Bad hz");
if (bitPerSample != 8 && bitPerSample != 16) throw new Exception("Bad bitPerSample");
if (blockSize != channelNum * bitPerSample / 8) throw new Exception("Bad blockSize");
if (bytePerSec != hz * blockSize) throw new Exception("Bad bytePerSec");
reader.Seek((long)size - 16, SeekOrigin.Current);
}
else if (
name[0] == (byte)'d' &&
name[1] == (byte)'a' &&
name[2] == (byte)'t' &&
name[3] == (byte)'a'
)
{
if (rawData != null)
throw new Exception("Has 2nd DATA");
rawData = new byte[size];
if (reader.Read(rawData, 0, size) != size)
throw new Exception("Read DATA Error");
}
else
{
reader.Seek((long)size, SeekOrigin.Current);
}
}
if (hz == -1) throw new Exception("No FMT");
if (channelNum == -1) throw new Exception("No channelNum");
if (bitPerSample == -1) throw new Exception("No bitPerSample");
if (rawData == null) throw new Exception("No DATA");
double[] linear;
double[][] wave;
if (bitPerSample == 8)
{
linear = new double[rawData.Length];
for (int index = 0; index < rawData.Length; index++)
linear[index] = ((int)rawData[index] - 128) / 128.0; // 8-bit samples are unsigned integers
}
else // 16
{
if (rawData.Length % 2 != 0)
throw new Exception("Bad DATA (rawData size)");
linear = new double[rawData.Length / 2];
for (int index = 0; index < rawData.Length / 2; index++)
linear[index] = ((((int)rawData[index * 2] | ((int)rawData[index * 2 + 1] << 8)) ^ 32768) - 32768) / 32768.0; // 16-bit samples are signed little-endian integers
}
if (channelNum == 1) // mono
{
wave = new double[][]
{
new double[linear.Length],
new double[linear.Length],
};
for (int index = 0; index < linear.Length; index++)
{
wave[0][index] = linear[index];
wave[1][index] = linear[index];
}
}
else // stereo
{
if (linear.Length % 2 != 0)
throw new Exception("Bad DATA (linear size)");
wave = new double[][]
{
new double[linear.Length / 2],
new double[linear.Length / 2],
};
for (int index = 0; index < linear.Length / 2; index++)
{
wave[0][index] = linear[index * 2 + 0]; // left channel sample
wave[1][index] = linear[index * 2 + 1]; // right channel sample
}
}
return wave;
}
}
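The 8-bit and 16-bit branches in the loader above can be isolated into two small conversion helpers. This is a sketch in Java rather than the surrounding C# (the class and method names are illustrative): unsigned 8-bit samples are centered at 128, while 16-bit samples are signed little-endian values reconstructed with the same XOR sign-extension trick used in the loop above.

```java
class PcmDecode {
    // Unsigned 8-bit sample (0..255, midpoint 128) -> [-1.0, 1.0).
    static double decode8(byte b) {
        return ((b & 0xFF) - 128) / 128.0;
    }

    // Signed little-endian 16-bit sample -> [-1.0, 1.0).
    static double decode16(byte lo, byte hi) {
        int u = (lo & 0xFF) | ((hi & 0xFF) << 8); // unsigned 0..65535
        // XOR flips bit 15, subtracting 0x8000 re-centers: sign extension.
        return ((u ^ 0x8000) - 0x8000) / 32768.0;
    }
}
```

For example, bytes `0x00 0x80` (the most negative 16-bit sample, -32768) map to exactly -1.0, and the 8-bit midpoint 128 maps to 0.0.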
private double[][][] WaveToSpectra(double[][] wave, int wave_hz, int fps)
{
double[] SPECTRUM_HZS = new double[]
{
// --https://en.wikipedia.org/wiki/Piano_key_frequencies
|
schema": { "$schema": "http://json-schema.org/draft-04/schema#","type": "object", "properties": {"raw-text": {"type": "string" } }, "required": ["raw-text"], "additionalProperties": false }}'}};
comp.doDFSchemaChange(ev);
expect(comp.schemaMsg).toMatch("Invalid Schema - must specify jsonschema");
});
it('JSON schema must be 04', () => {
let ev = {target: {name: "test", value: '{"self": { "name": "CUDA Simple JSON Example","version": "1.0.0","description": "An example of unnested JSON schema for CUDA Input and output"},"dataformatversion": "1.0.0", "jsonschema": { "$schema": "http://json-schema.org/draft-05/schema#","type": "object", "properties": {"raw-text": {"type": "string" } }, "required": ["raw-text"], "additionalProperties": false }}'}};
comp.doDFSchemaChange(ev);
expect(comp.schemaMsg).toMatch(/Invalid JSON Schema Data Format - jsonschema\$schema version must be 04/);
});
it('should fail if JSON input and no schema', () => {
let ev = {target: {name: "test", value: '{}'}};
comp.doDFJSONChange(ev);
expect(comp.jsonMsg).toMatch("Enter a Valid Schema");
});
it('should validate if JSON input and match valid schema', () => {
let ev = {target: {name: "test", value: '{"self": { "name": "CUDA Simple JSON Example","version": "1.0.0","description": "An example of unnested JSON schema for CUDA Input and output"},"dataformatversion": "1.0.0", "jsonschema": { "$schema": "http://json-schema.org/draft-04/schema#","type": "object", "properties": {"raw-text": {"type": "string" } }, "required": ["raw-text"], "additionalProperties": false }}'}};
comp.doDFSchemaChange(ev);
let jev = {target: {name: "test", value: '{"raw-text": "test"}'}};
comp.doDFJSONChange(jev);
expect(comp.jsonMsg).toMatch("JSON Input Validated");
});
it('should not validate if JSON input does not match valid schema', () => {
let ev = {target: {name: "test", value: '{"self": { "name": "CUDA Simple JSON Example","version": "1.0.0","description": "An example of unnested JSON schema for CUDA Input and output"},"dataformatversion": "1.0.0", "jsonschema": { "$schema": "http://json-schema.org/draft-04/schema#","type": "object", "properties": {"raw-text": {"type": "string" } }, "required": ["raw-text"], "additionalProperties": false }}'}};
comp.doDFSchemaChange(ev);
let jev = {target: {name: "test", value: '{"badraw-text": "test"}'}};
comp.doDFJSONChange(jev);
expect(comp.jsonMsg).toMatch("JSON Input does not match Schema");
});
it('should recover if schema becomes valid', () => {
let ev = {target: {name: "test", value: '{"self": { "name": "CUDA Simple JSON Example","version": "1.0.0","description": "An example of unnested JSON schema for CUDA Input and output"}, "jsonschema": { "$schema": "http://json-schema.org/draft-04/schema#","type": "object", "properties": {"raw-text": {"type": "string" } }, "required": ["raw-text"], "additionalProperties": false }}'}};
comp.doDFSchemaChange(ev);
expect(comp.schemaMsg).toMatch("Invalid Data Format Schema");
let jev = {target: {name: "test", value: '{"raw-text": "test"}'}};
comp.doDFJSONChange(jev);
let ev2 = {target: {name: "test", value: '{"self": { "name": "CUDA Simple JSON Example","version": "1.0.0","description": "An example of unnested JSON schema for CUDA Input and output"},"dataformatversion": "1.0.0", "jsonschema": { "$schema": "http://json-schema.org/draft-04/schema#","type": "object", "properties":
|
was invalidated." UserInfo={NSDebugDescription=The connection to service on pid 0 named com.apple.commcenter.coretelephony.xpc was invalidated.}
2020-05-11 10:05:31.763561-0500 contractorselogs[1142:7697259] [Client] Updating selectors failed with: Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service on pid 0 named com.apple.commcenter.coretelephony.xpc was invalidated." UserInfo={NSDebugDescription=The connection to service on pid 0 named com.apple.commcenter.coretelephony.xpc was invalidated.}
2020-05-11 10:05:31.767694-0500 contractorselogs[1142:7697257] [Client] Updating selectors failed with: Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service on pid 0 named com.apple.commcenter.coretelephony.xpc was invalidated." UserInfo={NSDebugDescription=The connection to service on pid 0 named com.apple.commcenter.coretelephony.xpc was invalidated.}
2020-05-11 10:05:31.767851-0500 contractorselogs[1142:7697257] [Client] Updating selectors after delegate removal failed with: Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service on pid 0 named com.apple.commcenter.coretelephony.xpc was invalidated." UserInfo={NSDebugDescription=The connection to service on pid 0 named com.apple.commcenter.coretelephony.xpc was invalidated.}
2020-05-11 10:05:31.897952-0500 contractorselogs[1142:7697248] 6.13.0 - [Firebase/Analytics][I-ACS023007] Analytics v.60106000 started
2020-05-11 10:05:31.898253-0500 contractorselogs[1142:7697248] 6.13.0 - [Firebase/Analytics][I-ACS023008] To enable debug logging set the following application argument: -FIRAnalyticsDebugEnabled (see http://goo.gl/RfcP7r)
2020-05-11 10:05:31.926 [info][tid:main][RCTRootView.m:294] Running application contractorselogs ({
initialProps = {
};
rootTag = 1;
})
2020-05-11 10:05:31.967502-0500 contractorselogs[1142:7696193] [Crashlytics] Version 3.14.0 (144)
ℹ️-[TSDBLogger db_delete] maxAge: 259200
2020-05-11 10:05:32.021878-0500 contractorselogs[1142:7697361] ℹ️-[TSDBLogger db_delete] maxAge: 259200
ℹ️-[TSLocationManager init]
╔═════════════════════════════════════════════
║ TSLocationManager (build 344)
╠══════════════════════════════════════════════
{
activityRecognitionInterval = 10000;
activityType = 1;
authorization = {
};
autoSync = 1;
autoSyncThreshold = 0;
batchSync = 0;
debug = 0;
desiredAccuracy = "-1";
desiredOdometerAccuracy = 100;
didLaunchInBackground = 0;
disableAutoSyncOnCellular = 0;
disableElasticity = 0;
disableLocationAuthorizationAlert = 0;
disableMotionActivityUpdates = 0;
disableStopDetection = 0;
distanceFilter = 10;
elasticityMultiplier = 1;
enableTimestampMeta = 0;
enabled = 1;
encrypt =
|
28% || 0.60 | 0.47 | -22% | 0.0000 |
+| TPC-H 04 || 1387.1 | 1263.2 | -9% || 0.72 | 0.79 | +10% | 0.0000 |
| TPC-H 05 || 3148.3 | 3016.8 | -4% || 0.32 | 0.33 | +4% | 0.0003 |
| TPC-H 06 || 175.0 | 182.3 | +4% || 5.71 | 5.49 | -4% | 0.0000 |
-| TPC-H 07 || 779.8 | 913.6 | +17% || 1.28 | 1.09 | -15% | 0.0000 |
| TPC-H 08 || 696.4 | 708.1 | +2% || 1.44 | 1.41 | -2% | 0.0000 |
+| TPC-H 09 || 5820.6 | 5494.9 | -6% || 0.17 | 0.18 | +6% | 0.0000 |
-| TPC-H 10 || 2254.1 | 2993.4 | +33% || 0.44 | 0.33 | -25% | 0.0000 |
-| TPC-H 11 || 70.2 | 81.8 | +17% || 14.24 | 12.22 | -14% | 0.0000 |
| TPC-H 12 || 975.4 | 999.0 | +2% || 1.03 | 1.00 | -2% | 0.0000 |
+| TPC-H 13 || 6252.6 | 5359.1 | -14% || 0.16 | 0.19 | +17% | 0.0000 |
+| TPC-H 14 || 499.2 | 461.0 | -8% || 2.00 | 2.17 | +8% | 0.0000 |
-| TPC-H 15 || 236.8 | 265.5 | +12% || 4.22 | 3.77 | -11% | 0.0000 |
-| TPC-H 16 || 577.0 | 631.4 | +9% || 1.73 | 1.58 | -9% | 0.0000 |
| TPC-H 17 || 218.9 | 226.0 | +3% || 4.57 | 4.43 | -3% | 0.0000 |
-| TPC-H 18 || 1608.9 | 1875.8 | +17% || 0.62 | 0.53 | -14% | 0.0000 |
-| TPC-H 19 || 251.1 | 280.8 | +12% || 3.98 | 3.56 | -11% | 0.0000 |
| TPC-H 20
|
input: Please answer the following: Facts: - teama: city origin - source: -lsb- -rsb- -lsb- http://www.rugbyleagueproject.org/players/john_skandalis/summary.html rlp -rsb- http://www.dailytelegraph.com.au/sport/nrl/player?playerid=100400&teamid=3400 league central -rsb- -lsb- http://www.nrlstats.com.au/archive/players.cfm?playerid=307 nrl stats - points: 12 60 20 4 - birth date: 16 june 1976 - weight: 113 kg stlb on - fieldgoals: 0 0 0 0 - pointsa: 0 - appearances: 64 166 46 19 - triesa: 1 - caption: skandalis in 2005 - height: 190 cm ftin on - tries: 3 15 5 1 - retired: yes - name: john skandalis - birth place: sydney , new south wales , australia - position: pr - appearancesa: 4 - image: john skandalis -lrb- 23 april 2005 -rrb- . jpg - updated: 20 october 2010 - goalsa: 0 - yearastart: 2002 - yearend: 99 06 08 10 - yearstart: 1996 2000 2007 2009 - nickname: skando - yearaend: 06 - fieldgoalsa: 0 - club: wests magpies wests tigers huddersfield giants wests tigers - goals: 0 0 0 0 Based on these bullet points, write a short biography describing the life of john skandalis .
++++++++++
output: john skandalis -lrb- born 16 june 1976 in sydney , new south wales -rrb- is an australian former professional rugby league footballer of the 1990s and 2000s for the wests tigers of the nrl .he played previously for the western suburbs magpies and huddersfield giants .skandalis primarily played at prop forward .
input: Please answer the following: Facts: - profession: economist - website: http://www.balashov.com.ua/ - residence: kyiv , ukraine - party: 5.10 - office: people 's deputy of ukraine - occupation: politician , businessman - term end: april , 2002 - term start: may , 1998 - birth date: 20 february 1961 - name: gennady balashov Геннадий Балашов - nationality: ukraine - image: gennady balashov.jpg - birth place: dnipropetrovsk , ukraine - alma mater: dnipropetrovsk national university Based on these bullet points, write a short biography describing the life of gennady balashov .
++++++++++
output: gennady victorovich balashov -lrb- , born 20 february 1961 , dnepropetrovsk , dnipropetrovsk oblast , ussr -rrb- is a ukrainian businessman , blogger , politician and leader of 5.10 political party .he is an advocate of a radical reform of ukraine 's tax system .specifically , he is pushing for the introduction of only two taxes : 5 % sales tax and 10 % payroll tax .during euromaidan events , balashov sided with the protesters and was actively campaigning for his ideas , calling for the introduction of 5.10 system .at the beginning of march , as the crisis in crimea was deepening , he travelled to simferopol , but was kidnapped by allegedly local self-defense forces .upon his release and return to kiev , balashov said that he was kidnapped , robbed and was being beaten for eight hours by local bandits .
input: Please answer the following: Facts: - years active: 1991 -- present - past members: bleu engelsman kevin dixon brad rozier john massel matt van brandon vandyke power mike mckendrick mark petz craig de
|
Some of this I agree with it and it hits the nail on the head of my complaints with the Agarthans, albeit more so their treatment than anything. Same thing with the Nabateans. I'll be frank and say they're two of my favorite aspects of Three Houses and they've made Fodlan one of my favorite settings in Fire Emblem media. What angers me though is how criminally they're both underutilized because the bulk of all attention is given to Edelgard, Dimitri, Claude, and their respective nations. Not even a dedicated route to Fodlan's ancient past which is still influencing the continent millennia afterward. Three Hopes gave the Agarthans just a little more spotlight with Arval/Epimenides , but so little was done with them and likewise the Nabateans. It's frustrating since I wanted to see that conflict more so than the bog-standard human conflict that ended up being the main focus.
Other parts of this, I highly disagree with so...(inhales)
I'm genuinely tired of seeing this constant refrain of "Agarthans ruin the game's morally grey atmosphere". It's undeniable that life can exist in shades of gray, but sometimes you have people who are just genuinely shades of black. Hatred can do that, and it can stick with people for *ages*. That's what the Agarthans are: they've spent presumably thousands of years stewing in their hatred instead of moving on from it. Real-life people can do that (minus the fantasy elements), and in that sense the Agarthans are realistic. People invest way too heavily in this notion of moral greyness, especially in a game that tries to paint everyone in that light. Not every side in a story or conflict has to have some goodness or shred of decency in them.
Sothis shared her knowledge with the first Agarthans, and in that you can draw parallels to real life. When you share knowledge or technology with someone else, there is always a possibility they will surpass you, and with that will come a sense of superiority. For the Agarthans, this led to them presumably trying to nuke her and her children...and then blaming her when she fought back.
This is also where I take issue with the whole "delete" thing. Without the Agarthans attempting to kill Sothis, she'd have never tired herself out fixing Fodlan. If she had still been around, it's highly doubtful Nemesis could have done as you said and waged war on the Nabateans. With what weapons would he have fought an entire race of superpowered dragons? In that same Nintendo Dream interview, I believe it was stated the Agarthans were the first ones to create Relic Weapons, and after doing so they wanted more power. They're the ones Nemesis presumably went to in order to get the weapons and blood needed to create Crests.
I'd also say the whole "Sothis and the Nabateans weren't benevolent rulers" thing might be dented since the Ten Elites were rewritten to be allies of Seiros, Sothis' champion and divine spokesman. That would have made them *less* appealing to their supposed anti-Sothis subjects. Given Maurice's words in Marianne's paralogue, one also has to consider what the developers meant when they said Nemesis and his Elites were considered "heroes". Hero doesn't always mean morally righteous or likable, it can just mean strong or powerful, moral or personal beliefs notwithstanding.
Regarding Epimenides' disagreement with Thales, it's more about his brand of leadership. In his paralogue, he speaks derogatorily of Fodlan's human populace, Nemesis included, and of Sothis' children. If Thales' personal class is anything to go by, it's no wonder no one's rebelling against him: it wouldn't end well for them.
Onto Thales himself. Some clues can be gleaned from his final words in SS/VW. This, I'll admit, requires some thinking. He and his people have spent the last thousands of years living underground while the humans above live in the light (and worship the goddess who "ruined" them). This is envy and hatred, the haves and the have nots. Thales and the Agarthans want them to suffer. **Hating someone isn't just about not liking them, it's wanting them to suffer physically and emotionally**. The Agarthans do all the above because, again, they're jealous and angry the people of Fodlan get to enjoy something they can't. Not to mention they view them as beneath them.
European colonizers viewed Native Americans as being uncivilized and savage because of their supposed technological, societal, and religious superiority. The American South adamantly refused to acknowledge blacks as being equal. This is how the Agarthans view the current people of Fodlan. They're the remnants of
|
Karjiang in China (卡热疆峰) is presented in this virtual aerial flyover video. Mountains of China https://www.youtube.com/watch?v=mr0OOut8K-M&list=PLZH_w6_UJWQJ.. Multiple groups have tried—and failed—to climb Karjiang I in Tibet and its 23,691-foot peak. The highly technical nature of the climb coupled with avalanche-prone landscapes and ever-shifting.
Karjiang is a mountain in Tibet, located near the Bhutan-China border. The highest peak of the Karjiang group is Karjiang I or Karjiang South, with an elevation of ; it remains unclimbed. Other peaks include Karjiang North (7196 m), Karjiang II/Central (7045 m), Karjiang III or Taptol Kangri (6820 m) and the top of the north-eastern shoulder (6400 m). The group climbed nearby Karjiang III (6,820m/22,375ft) before bad weather forced the team to descend and then eventually depart the mountain altogether. In 2010, American mountaineers Joe Puryear and David Gottlieb attempted to climb Karjiang I. However, they failed to get the necessary permit and made an attempt on Labuche Kang III instead. This is a list of the world's 117 highest mountains, all of which lie in Asia. Only peaks considered independent mountains are included in the ranking, not secondary summits that are part of a larger peak formation. This is the list of top 7 unclimbed mountains in the world: 7. Mount Kailash, 6,638m; 6. Janhukut, 6,805m; 5. Karjiang South, 7,221m; 4. Labuche Kang 3, 7,..
The latest Tweets from Keith (@karjiang).
The 5 highest unclimbed mountains in the world, No. 5 - Karjiang, Tibet, 7,221 meters. Located in Tibet, near the border of Bhutan, this is a very technical mountain and prone to avalanches. There's plenty of steep rock and ice to get through before.. View from Kaluxung South Summit (6536m): in the background, Phuma Yumtso Lake and the Bhutan Himalaya with Karjiang (7221m, currently still unclimbed) and Kula Kangri (7538m) (Central Tibet - Karo La Group - 26 October 2007). This map was created using GPS Visualizer's do-it-yourself geographic utilities.
Karjiang, Tibet: Karjiang South is the highest peak of the Karjiang, elevated at 7,221 m. Many attempts, albeit unsuccessful, were made to climb the mountain, which is known to be prone to severe.
Used in the following map: Map 322962 (Peaks 7000-7999m), CommunityWalk Inc. This page shows the sunrise and sunset times in Karjiang I, Lhozhag, Shannan, China, including beautiful sunrise or sunset photos, the local current time, time zone, and longitude.
|
}
if (num24 > 15)
{
num24 = 15;
}
bits.Pack(num24, 4);
num23 = (float)Math.Exp(0.27027027027027023 * (double)num24 - 2.0);
float num25 = num23 * (float)Math.Sqrt((double)(1f + num12)) / num20;
float num26 = 1f / num25;
for (int i = 0; i < this.subframeSize; i++)
{
this.excBuf[num15 + i] = 0f;
}
this.excBuf[num15] = 1f;
Filters.Syn_percep_zero(this.excBuf, num15, this.interp_qlpc, this.bw_lpc1, this.bw_lpc2, array2, this.subframeSize, this.lpcSize);
for (int i = 0; i < this.subframeSize; i++)
{
this.excBuf[num15 + i] = 0f;
}
for (int i = 0; i < this.lpcSize; i++)
{
array[i] = this.mem_sp[i];
}
Filters.Iir_mem2(this.excBuf, num15, this.interp_qlpc, this.excBuf, num15, this.subframeSize, this.lpcSize, array);
for (int i = 0; i < this.lpcSize; i++)
{
array[i] = this.mem_sw[i];
}
Filters.Filter_mem2(this.excBuf, num15, this.bw_lpc1, this.bw_lpc2, this.res, num16, this.subframeSize, this.lpcSize, array, 0);
for (int i = 0; i < this.lpcSize; i++)
{
array[i] = this.mem_sw[i];
}
Filters.Filter_mem2(this.high, num14, this.bw_lpc1, this.bw_lpc2, this.swBuf, num17, this.subframeSize, this.lpcSize, array, 0);
for (int i = 0; i < this.subframeSize; i++)
{
this.target[i] = this.swBuf[num17 + i] - this.res[num16 + i];
}
for (int i = 0; i < this.subframeSize; i++)
{
this.excBuf[num15 + i] = 0f;
}
for (int i = 0; i < this.subframeSize; i++)
{
this.target[i] *= num26;
}
for (int i = 0; i < this.subframeSize; i++)
{
array3[i] = 0f;
}
this.submodes[this.submodeID].Innovation.Quantify(this.target, this.interp_qlpc, this.bw_lpc1, this.bw_lpc2, this.lpcSize, this.subframeSize, array3, 0, array2, bits, this.complexity + 1 >> 1);
for (int i = 0; i < this.subframeSize; i++)
{
this.excBuf[num15 + i] += array3[i] * num25;
}
if (this.submodes[this.submodeID].DoubleCodebook != 0)
{
float[] array4 = new float[this.subframeSize];
for (int i = 0; i < this.subframeSize; i++)
{
array4[i] = 0f;
}
for (int i = 0; i < this.subframeSize; i++)
{
this.target[i] *= 2.5f;
}
this.submodes[this.submodeID].Innovation.Quantify(this.target, this.interp_qlpc, this.bw_lpc1, this.bw_lpc2, this.lpcSize, this.subframeSize, array4, 0, array2, bits, this.complexity + 1 >> 1);
for (int i = 0; i < this.subframeSize; i++)
{
array4[i] *= (float)((double)num25 * 0.4);
}
for (int i = 0; i < this.subframeSize; i++)
{
this.excBuf[num
|
Id.GetOrCompute("Enemy Entity ID"), MyStringId.GetOrCompute("If set, only target an enemy with this entity ID"));
enemyId.Getter = block => new StringBuilder(m_enemyId.ToString());
enemyId.Setter = (block, value) => {
if (!long.TryParse(value.ToString(), out m_enemyId))
m_enemyId = -1L;
};
controls.Add(enemyId);
}
if (m_responseListbox == null)
{
m_responseListbox = new MyTerminalControlListbox<MyShipController>("Responses", MyStringId.GetOrCompute("Responses"), MyStringId.NullOrEmpty);
m_responseListbox.ListContent = ListContent;
m_responseListbox.ItemSelected = ItemSelected;
}
controls.Add(m_responseListbox);
if (!m_addingResponse)
{
controls.Add(new MyTerminalControlButton<MyShipController>("AddResponse", MyStringId.GetOrCompute("Add Response"), MyStringId.NullOrEmpty, AddResponse));
controls.Add(new MyTerminalControlButton<MyShipController>("RemoveResponse", MyStringId.GetOrCompute("Remove Response"), MyStringId.NullOrEmpty, RemoveResponse));
controls.Add(new MyTerminalControlButton<MyShipController>("MoveResponseUp", MyStringId.GetOrCompute("Move Response Up"), MyStringId.NullOrEmpty, MoveResponseUp));
controls.Add(new MyTerminalControlButton<MyShipController>("MoveResponseDown", MyStringId.GetOrCompute("Move Response Down"), MyStringId.NullOrEmpty, MoveResponseDown));
}
}
protected override AutopilotActionList.AutopilotAction Parse(VRage.Game.ModAPI.IMyCubeBlock autopilot, string command, out string message)
{
if (!ServerSettings.GetSetting<bool>(ServerSettings.SettingName.bAllowWeaponControl))
{
message = "Weapon control is disabled in settings";
return null;
}
string[] split = command.RemoveWhitespace().Split(new[] { ',' }, StringSplitOptions.RemoveEmptyEntries);
m_range = 0f;
long entityId = 0L;
m_activeResponses.Clear();
foreach (string s in split)
{
if (s.Equals("off", StringComparison.InvariantCultureIgnoreCase))
{
m_activeResponses.Add(EnemyFinder.Response.None);
message = null;
return mover => mover.NavSet.Settings_Commands.EnemyFinder = null;
}
float range;
if (PrettySI.TryParse(s, out range))
{
m_range = range;
continue;
}
if (s.StartsWith("id", StringComparison.InvariantCultureIgnoreCase))
{
if (s.Length < 3)
{
message = "Could not get id from " + s;
return null;
}
string idStr = s.Substring(2, s.Length - 2);
IMyEntity entity;
if (!long.TryParse(idStr, out entityId) || !MyAPIGateway.Entities.TryGetEntityById(entityId, out entity))
{
message = "Not an id: " + idStr;
return null;
}
else
m_enemyId = entityId;
continue;
}
string resStr = s.Replace('-', '_');
EnemyFinder.Response r;
if (!Enum.TryParse<EnemyFinder.Response>(resStr, true, out r))
{
message = "Not a response: " + resStr;
return null;
}
else
m_activeResponses.Add(r);
}
if (m_activeResponses.Count == 0)
{
message = "No responses";
return null;
}
message = null;
return mover => {
if (mover.NavSet.Settings_Commands.EnemyFinder == null)
mover.NavSet.Settings_Commands.EnemyFinder = new EnemyFinder(mover, mover.NavSet, entityId);
mover.NavSet.Settings_Commands.EnemyFinder.AddResponses(m_range, m_activeResponses);
};
}
protected override string TermToString()
{
if (m_activeResponses.Contains(EnemyFinder.Response.None))
return Identifier + " Off";
string result = Identifier + ' ';
if (m_range > 0f)
result += PrettySI.makePretty(m_range) + ',';
result += string.Join(",", m_activeResponses);
if (m_enemyId != 0L)
result += ",ID" + m_enemyId;
return result;
}
private void ListContent(IMyTerminalBlock autopilot, List<MyTerminalControlListBoxItem> items, List<MyTerminalControlListBoxItem> selected)
{
if (m_addingResponse)
{
foreach (EnemyFinder.Response response in m_allResponses.Select(item => (EnemyFinder.Response)item.UserData
|
},RandomNumbers.Xorshifts.Xoroshiro128Plus},Array{Array{Float64,1},1},true},##109#111,##110#112,Void,UniformScaling{Int64},Void},StochasticDiffEq.SRIW1,StochasticDiffEq.LinearInterpolationData{Array{Array{Float64,1},1},Array{Float64,1}}},StochasticDiffEq.SRIW1ConstantCache,Void,##109#111,##110#112,StochasticDiffEq.SDEOptions{Float64,Float64,DiffEqBase.#ODE_DEFAULT_NORM,DiffEqBase.CallbackSet{Tuple{},Tuple{}},DiffEqBase.#ODE_DEFAULT_ISOUTOFDOMAIN,DiffEqBase.#ODE_DEFAULT_PROG_MESSAGE,DiffEqBase.#ODE_DEFAULT_UNSTABLE_CHECK,DataStructures.BinaryHeap{Float64,DataStructures.LessThan},Void,Void,Float64,Float64,Float64,Float64},DiffEqNoiseProcess.NoiseWrapper{Float64,2,Float64,Array{Float64,1},Array{Float64,1},DiffEqNoiseProcess.NoiseProcess{Float64,2,Float64,Array{Float64,1},Array{Float64,1},Array{Array{Float64,1},1},DiffEqNoiseProcess.OrnsteinUhlenbeck!{Float64,Float64,Float64},Void,true,DataStructures.Stack{Tuple{Float64,Array{Float64,1},Array{Float64,1}}},ResettableStacks.ResettableStack{Tuple{Float64,Array{Float64,1},Array{Float64,1}}},DiffEqNoiseProcess.RSWM{:RSwM3,Float64},RandomNumbers.Xorshifts.Xoroshiro128Plus},Array{Array{Float64,1},1},true}}) at ~/.julia/v0.6/StochasticDiffEq/src/solve.jl:391
[4] #solve#58(::Array{Any,1}, ::Function, ::DiffEqBase.SDEProblem{Array{Float64,1},Float64,false,DiffEqNoiseProcess.NoiseWrapper{Float64,2,Float64,Array{Float64,1},Array{Float64,1},DiffEqNoiseProcess.NoiseProcess{Float64,2,Float64,Array{Float64,1},Array{Float64,1},Array{Array{Float64,1},1},DiffEqNoiseProcess.OrnsteinUhlenbeck!{Float64,Float64,Float64},Void,true,DataStructures.Stack{Tuple{Float64,Array{Float64,1},Array{Float64,1}}},ResettableStacks.ResettableStack{Tuple{Float64,Array{Float64,1},Array{Float64,1}}},DiffEqNoiseProcess.RSWM{:RSwM3,Float64},RandomNumbers.Xorshifts.Xoroshiro128Plus},Array{Array{Float64,1},1},true},##109#111,##110#112,Void,UniformScaling{Int64},Void}, ::StochasticDiffEq.SRIW1, ::Array{Any,1}, ::Array{Any,1}, ::Array{Any,1}, ::Type{Val{true}}) at ~/.julia/v0.6/StochasticDiffEq/src/solve.jl:7
[5] (::DiffEqBase.#kw##solve)(::Array{Any,1}, ::DiffEqBase.#solve, ::DiffEqBase.SDEProblem{Array{Float64,1},Float64,false,DiffEqNoiseProcess.NoiseWrapper{Float64,2,Float64,Array{Float64,1},Array{Float64,1},DiffEqNoiseProcess.NoiseProcess{Float64,2,Float64,Array{Float64,1},Array{Float64,1},Array{Array{Float64,1},1},DiffEqNoiseProcess.OrnsteinUhlenbeck!{Float64,Float64,Float64},Void,true,DataStructures.Stack{Tuple{Float64,Array{Float64,1},Array{Float64,1}}},ResettableStacks.ResettableStack{Tuple{Float64,Array{Float64,1},Array{Float64,1}}},DiffEqNoise
using System.Diagnostics.CodeAnalysis;
using System.Linq;
using ManagedDns.Internal.Engines;
using ManagedDns.Internal.Factory;
using ManagedDns.Public;
using ManagedDns.Public.Models.RData;
using ManagedDns.Tests.TestResources;
using Xunit;
namespace ManagedDns.Tests.Internals.FactoryTests
{
[SuppressMessage("ReSharper", "PossibleNullReferenceException")]
public class RDataFactoryTests
{
[Fact]
public void FactoryTxtRecordTest()
{
var rec = RDataFactory.FactoryRecord(RecordType.TxtRecord, new RawByteParser(RDataBytes.TxtRdata().ToArray(), 0));
Assert.Equal("v=spf1 include:_spf.google.com ip4:172.16.31.10/31 ip4:192.168.3.11/31 ~all", (rec as Internal.Model.RData.TxtRecord).Text);
var rec2 = RDataFactory.FactoryRecordModel(RecordType.TxtRecord, RDataBytes.TxtRdata());
Assert.Equal("v=spf1 include:_spf.google.com ip4:172.16.31.10/31 ip4:192.168.3.11/31 ~all", (rec2 as TxtRecord).Text);
}
[Fact]
public void FactoryAaaaRecordTest()
{
var rec = RDataFactory.FactoryRecord(RecordType.AaaaRecord, new RawByteParser(RDataBytes.AaaaRdata().ToArray(), 0));
Assert.Equal("fdf8:f53e:61e4::18", (rec as Internal.Model.RData.AaaaRecord).Address.ToString());
var rec2 = RDataFactory.FactoryRecordModel(RecordType.AaaaRecord, RDataBytes.AaaaRdata());
Assert.Equal("fdf8:f53e:61e4::18", (rec2 as AaaaRecord).Address.ToString());
}
[Fact]
public void FactoryARecordTest()
{
var rec = RDataFactory.FactoryRecord(RecordType.ARecord, new RawByteParser(RDataBytes.ARData().ToArray(), 0));
Assert.Equal("172.16.58.3", (rec as Internal.Model.RData.ARecord).Address.ToString());
var rec2 = RDataFactory.FactoryRecordModel(RecordType.ARecord, RDataBytes.ARData());
Assert.Equal("172.16.58.3", (rec2 as ARecord).Address.ToString());
}
[Fact]
public void FactoryCNameRecordTest()
{
var rec = RDataFactory.FactoryRecord(RecordType.CNameRecord, new RawByteParser(RDataBytes.CNameRData().ToArray(), 0));
Assert.Equal("fd-fp3.wg1.b.", (rec as Internal.Model.RData.CNameRecord).CName);
var rec2 = RDataFactory.FactoryRecordModel(RecordType.CNameRecord, RDataBytes.CNameRData());
Assert.Equal("fd-fp3.wg1.b.", (rec2 as CNameRecord).CName);
}
[Fact]
public void FactoryMxRecordTest()
{
var rec = RDataFactory.FactoryRecord(RecordType.MxRecord, new RawByteParser(RDataBytes.MxRData().ToArray(), 0));
Assert.Equal("mta7.am0.yahoodns.net.", (rec as Internal.Model.RData.MxRecord).Exchanger);
Assert.Equal(1, (rec as Internal.Model.RData.MxRecord).Preference);
var rec2 = RDataFactory.FactoryRecordModel(RecordType.MxRecord, RDataBytes.MxRData());
Assert.Equal("mta7.am0.yahoodns.net.", (rec2 as MxRecord).Exchanger);
Assert.Equal(1, (rec2 as MxRecord).Preference);
}
[Fact]
public void FactoryNsRecordTest()
{
var rec = RDataFactory.FactoryRecord(RecordType.NsRecord, new RawByteParser(RDataBytes.NsRData().ToArray(), 0));
Assert.Equal("ns5.", (rec as Internal.Model.RData.NsRecord).NsDomainName);
var rec2 = RDataFactory.FactoryRecordModel(RecordType.NsRecord, RDataBytes.NsRData());
Assert.Equal("ns5.", (rec2 as NsRecord).NameServerDomainName);
}
[Fact]
public void FactoryPtrRecordTest()
{
var rec = RDataFactory.FactoryRecord(RecordType.PtrRecord, new RawByteParser(RDataBytes.PtrRData().ToArray(), 0));
Assert.Equal("dfw06s40-in-f
recognizing there is a problem is a great start! as a relative of hoarders (grandparents, parent, other relatives), i would also recommend breaking down the process into really manageable, ACHIEVABLE parts to be able to keep going. there will be times when you might lose motivation, but if you can start back up again and not make excuses and fall back into old habits, you will see results!
i noticed that a lot of the conversation in the comments focused on your son's behavior and also shifting some blame to other members of your family. i have no idea what the dynamics of your relationships with your family are, but here's what i've noticed with the relationship between my cousins and my aunt (their mother), who is a hoarder. i saw your comment about how you posted this after your son yelled at you late at night, so i wanted to give you some perspective from the children of a hoarder. i don't know your situation, so it might not apply to you.
i love my aunt and consider her a second mother, but here is the unvarnished description of her situation: my aunt doesn't truly recognize how her hoarding has affected my cousins and the extent of the control she has to have over every object in her house. as a coping mechanism, her family minimizes the hoarding and most times acts as if it is not there, but sporadically their frustration and anger erupts at my aunt.
to my aunt, when a family member blows up at her, it is surprising and unfair because she has been "a little messy" but intends to "clean up." to her, the other family members don't do their share and are a large part of the problem. but the reality is that the family has given up on decluttering and keeping things neat and organized because she must have control over even their belongings that they no longer want. everything is piled so high, it is impossible to be neat. in their daily lives, they go along with the fairy tale that everything is ok because if they truly recognized the mess every single day, they all would be more depressed than they already are. they avoid talking about the mess with my aunt because she becomes angry and blames them for a lot of the mess. (also, who has the energy to argue just to try to throw away an old newspaper or ripped t-shirt?).
old, broken, soiled belongings pile up because my aunt "just needs to wash it" and "someone could still use it," but she never seems to have the time to deal with them "properly." piles and piles of my adult cousins' old clothes from when they were babies and kids (several feet tall - some stacked to the ceiling) are still in the house. my cousins have kept the hoarding a secret, so throughout elementary, middle and high school, none of their friends knew why they weren't allowed in the house. my cousin has dated his girlfriend for over 5 years and could only reveal the hoarding issue after 2 years. another cousin can only bear to visit a couple of times a year and never stays at the house because he felt so abandoned when he came home for the summer after his first year of college and no one cleaned up to make any space for him or his relatively small amount of stuff.
my cousins could only keep a few things in their own bedroom closets because they were mostly full of old things my aunt couldn't get rid of. they have to shuffle sideways through the house because there are items piled up in hallways. the "stuff" slowly, steadily grows and seems to push out the family members who are supposed to be the most important people in my aunt's life. living with a hoarder like my aunt is hard because you have very little control over your belongings and space and you are constantly participating in a farce that the situation isn't that bad or is normal. you usually manage to deal with it, but sometimes the frustration builds and you blow up in a way that is not mature or helpful, but understandable, and you end up blindsiding your hoarder parent.
your house may never be picture perfect, but if you can get to a point where you have changed your habits and shown your sons and husband that you are making an effort to change, then hopefully they will follow your lead and also take more responsibility. maybe you can reach a point where the house is clean enough where your boys feel comfortable and can invite friends over on a whim. you need your sons' help to keep things up, but if it is mostly your stuff or things that you won't let them get rid of, then you have to be the parent and role model to your sons before you have the leverage to actually tell them to change as well. if your husband is messy as well, hopefully he will come around and realize he must also take responsibility for the house (in spite of his work hours) because it is the place where you are raising your sons,
best chance at defeating this thing. I Trumped him and pulled him through – I’m glad I did, because he was in pretty rough shape. Fiona and Corwin weren’t doing the greatest job of keeping him safe. The four of us hashed out a plan (leaving more to chance than I’d care to admit) and the three of us headed back to the house, leaving Artemis behind again. I did establish a Trump connection with him, so that he would know what was going on.
Back down in the basement, Edmund did a spectacular job of convincing the Pattern that Brand had changed his plans and needed the pattern to deactivate. As soon as it shut down, turning into a two-dimensional Pattern on the floor, another shock wave rippled out. I checked with Artemis who said he seemed to be back to normal and that the energy creature was gone. I pulled him through to join us in the basement. With the Pattern inert and Artemis back to normal, the last piece of the puzzle seemed to be figuring out how to destroy the thing. Edmund suggested maybe studying it, but I think we all agreed it was too dangerous to keep around. Maeve promptly marched forward, cut her arm and bled across a part of the Pattern, which damaged that part. Unsure in this situation, I Trumped Uncle Corwin who I assumed would be able to help us. I pulled him and Aunt Fiona through and Corwin suggested that the rest of us bleed on the pattern as well... sadistic, but okay. That done, Corwin deemed it destroyed enough and we returned to the theater.
Turns out when the Pattern deactivated it brought a quick end to what was left of the three fights. I’m not sure anyone else realizes how much Edmund accomplished today; I’m not sure he realizes it himself. The rest of the family returned to Amber while Maeve, Artemis and I returned to the house to see if we could figure out what activated the Pattern in the first place. Our search came up with absolutely nothing and we decided to move on.
Maeve took this opportunity to bring up some concerns she was having about Edmund. Apparently, everyone is concerned with Edmund being as evil as his father. I bristled slightly at the unfair assumption. If you think someone is evil, of course their actions are going to seem sinister. I know Edmund. He is not his father. As I stood there defending him, the memory of him in the Corridor of Mirrors floated to the front of my mind. No, I wouldn’t make the same assumptions as everyone else. I don’t even know what to believe about that damned corridor, after all it showed me Rowan and he’s not even alive.
Maeve wanted to return to her personal Shadow to work on some spells or something. She invited us both to join her, but I felt the case of Brand’s Trump cards in my pocket and I knew I couldn’t wait any longer. I let Maeve and Artemis go off together before pulling out Bleys’ Trump. I made arrangements to meet him in the library an hour later. Having some time to spare, I pulled out Edmund’s Trump and had him pull me through. We chatted for a bit; I congratulated him again on his victory today, encouraged him to give people time to accept him, and offered to spend time showing him around and getting to know the family. He asked questions about how I was, about the prophecy and how things were going. For some reason everyone seems to think that this was one of the three catastrophes we are supposed to prevent, but I’m not so sure. Something doesn’t feel right about that... or maybe I’m just getting a bit landsick.
I promised to meet Edmund for dinner, and then made my way to see Uncle Bleys. I arrived at the Library a bit early and paced around the room with a pit in my stomach; I knew how this was going to go, but I also knew that I didn’t have a better option. Bleys arrived and we had a drink while discussing Edmund and his concerns with being accepted here in Amber. He agreed to give Edmund the benefit of the doubt and to try to help him.
With that done, I pulled out Brand’s Trumps and passed them to Bleys. He looked through the cards, then set them in his lap. And with that, I knew that I was never going to get them back. I do think I managed to convince him not to destroy them. Honestly, part of me is glad not to be responsible for those anymore. I thanked Bleys for his help and left to meet Edmund for dinner. We spent the better part of an hour wandering around; Edmund pretty much understood how the place worked having grown up in Rebma.
I left Edmund at his rooms before returning to my own for the night. Finally alone, I took a moment to attend to my sore leg. Just a sprain with some swelling and the most colorful bruise I think I’ve ever seen;
that anyone can pick up your passport/visa with the receipt you get upon drop-off, so safeguard your receipt and be nice to your Manhattan-based friends (or those who will at least be in town for a conference as was the case in my situation) as they can help you out in a pinch.
However, it’s also worth being aware that the timeline they give you isn’t guaranteed. My visa was ready in 3 business days (paid a little extra for faster service), but I did see people in my trips to the consulate who were told to come back the next day to pick up their visas. Whether those people had waited out the entire suggested wait time or not, I do not know. Both times I’ve been to the consulate, my visa was completed in the time period I expected, but others appeared not to be so lucky.
Upon return to the US, we were arriving at JFK airport and departing again from LGA. This meant we would have to clear customs at JFK and then transfer airports. That wasn’t going to be an issue for us in terms of time (we had plenty of time on our hands), but I wanted to get through quickly since every minute less at immigration would be a minute longer spent in the Amex Centurion Lounge at LGA.
Once you download the app, you can add travelers by scanning each person’s passport information and then taking/uploading a picture of that person’s face. Just like with a customs form, you only need to do this once for a family traveling together (i.e. one family member can download the app and add the entire family). After entering and confirming each traveler’s information, it will ask you to choose your port of entry.
You also choose your airline in the app. Then, it will ask you the typical customs form questions about where you have been and what you are bringing into the country.
Once you have answered all of the questions, it will ask if you are ready to submit your answers. The app will remind you that you only have 4 hours from the time you submit the answers to go through immigration, so you’ll want to hold off on submitting the answers until you are at your port of entry. I submitted my answers while walking down the jet bridge on my way off my entry flight.
As it turned out, I definitely wasn’t going to need 4 hours. When we got to immigration, I looked for the signs for “Mobile Passport”. At our point of arrival (Terminal 7), that meant going past the normal entry points for the lines to enter the US and over to the Global Entry kiosks. While those with Global Entry were stopped at the kiosks to enter their information, I strolled right past those machines and up to the final line for passport control. You really need to look for the Mobile Passport signs and the logo, which, based on my two experiences with Mobile Passport (in Boston and NYC), are not very well posted. The Mobile Passport logo looks just like the app icon.
Be on the lookout for that image of an officer with an outstretched arm and either the words “Mobile passport” or the acronym “MPC”, which I assume means something like “Mobile Passport Control”. In both of my experiences, my eyes had to hunt for the sign and I had to resist the urge to get on the first line I saw with fear of the entire plane getting on line in front of me.
When we got to the dedicated line for Global Entry / Mobile Passport, there was a family ahead of us — but it was literally one family in front of us. When they were finished, we were up. The Mobile Passport app generates QR codes for each traveler. The agent simply had to scan the QR codes and we were on our way to the “nothing to declare” lane to enter the US (as we had nothing to declare). A second scan there of our QR codes and we were back in the USA, skipping the line and taking a grand total of under 5 minutes for re-entry at 10:30am on a weekday at JFK. The regular immigration line wasn’t outrageous at the time — but it was much longer than 1-family long. At a busy time, I could really see this being a huge benefit. The app is free and easy to use and doesn’t require any interview / set up apart from scanning your passports, uploading your photo, and selecting the basic information as shown above. It really couldn’t be much easier.
If you need to apply for a visa, do yourself a favor and read the directions. Failure to read the form instructions almost cost me a 6hr trip to New York for nothing. Luckily, a good samaritan came to my aid to save me from a wasted trip. Then, upon re-entry to the USA, I saved myself from waiting in a longer immigration & customs line thanks to the Mobile Passport app. Since that app is free, simple to use, and doesn’t require the prep work of Global Entry, it will become my preferred method
LDAP and SAML. We'll preserve the Git history for each code line. These will become github.com/silverstripe/silverstripe-ldap and github.com/silverstripe/silverstripe-saml respectively.
In terms of the work you've done already @username_0 do you think you could probably wait until the modules have been split, then make PRs against the new module that's relevant to your code. Does this seem sensible?
For the record we have done some testing (with #116) on the LDAP part of this module in SS4, but the SAML part has only been theoretically upgraded, not tested in practice.
<issue_comment>username_0: Sure! Just ping me when ready and I'd be delighted to set up a PR once the repository has been set up. This alleviates some of the responsibility of setting up and maintaining both branches, particularly since I think this should live under SilverStripe's github org anyway.
Only question is (leading up to that PR): What do you base the `silverstripe-saml` repository on? I've already split *that* off and cleaned up some things which are SAML-only. It does look like a majority of the efforts put into #116 are LDAP related and (indeed) only superficially touching on SAML-specific functionality. I've reviewed my changes `3` -> `3-saml` (split) compared to that PR and I'm not seeing much overlap except for `SAMLController`, but I see some of the bugs I've fixed are retained and (I believe) can be merged in either ahead or after (i.e. either by you folks or merged/fixed by me). Here's a PR documenting my progress on the separation thus far: https://github.com/username_0/silverstripe-saml/pull/1/files
I think it _still might_ be fine if I were to try to migrate my work thus far into `master` and either base my PR's off that, or inversely, the new `silverstripe-saml` modules could be derived and then modified from there. Whatever is easier.
<issue_comment>username_3: We'd base it on whatever is in the master branch here (including #116) then iterate from there.
I think we should be able to merge in your changes for SAML into that repository when it's ready :)
<issue_comment>username_0: This should also be tagged with `affects/v3` as well, depending on the meaning and implementation of the split.
<issue_comment>username_3: I think we'll only be doing this for SS4.
Here are the modules with the code split up:
* https://github.com/silverstripe/silverstripe-ldap
* https://github.com/silverstripe/silverstripe-saml
The last part of this will be to convert the master branch in this repository into an AD recipe. I've also logged issues on each of those repos to go through and update the docs to ensure they're correct and relevant for each.
Both modules have been created based on this module's repository history, so PRs will fit nicely (ish) over the top.
@username_0 if you feel like PRs for your SAML work, they'd be greatly appreciated against the new SAML repository!
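For anyone following along, a history-preserving split like the one described above can be sketched with `git subtree split`. This is illustrative only: the repository layout, file names, and branch name below are hypothetical, not this module's actual structure.

```shell
#!/bin/sh
set -e
# Build a throwaway repo with a subdirectory to split out (all names hypothetical)
tmp=$(mktemp -d)
cd "$tmp"
git init -q repo && cd repo
git config user.email "dev@example.com"
git config user.name "Dev"
mkdir -p src/saml
echo "saml service code" > src/saml/Service.php
git add -A && git commit -qm "Add SAML service"
# Split src/saml's history onto its own branch; commits touching only
# other paths are dropped, so the new branch carries just SAML history.
git subtree split --prefix=src/saml -b saml-only
# On the split branch, the subdirectory's contents sit at the root:
git show saml-only:Service.php
```

The `saml-only` branch can then be pushed to a fresh repository, which is why PRs based on the old tree tend to apply cleanly on top.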
<issue_comment>username_3: Perhaps this isn't necessary after all. We might instead reset the master branch on this repo back to SS3 compatibility and leave it as it is with readme notices to the new SS4 repos.
Thoughts @username_0 @username_1 @username_4?
<issue_comment>username_4: Sounds fine to me. :D
<issue_comment>username_0: If we do this, we'll just need to set up a branch alias so folks using composer for this repo for SS4 will not be abandoned.
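For reference, a composer branch alias lives under the `extra` key of `composer.json` and looks roughly like this (the version numbers here are hypothetical, not this module's actual release line):

```json
{
    "extra": {
        "branch-alias": {
            "dev-master": "4.x-dev"
        }
    }
}
```

With that alias in place, a constraint like `^4.0@dev` would resolve to the `master` branch even before a tagged 4.x release exists.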
<issue_comment>username_0: Also @username_3 with what I'm seeing done here in silverstripe-saml, how do I submit a PR for a branch that doesn't exist (i.e. `3`)? If you branch off `master`, that's backwards from the original workflow, where `3` contains the older code that was then merged into `master` (to port over minor updates + bug fixes). Updating the new `3` branch would mean removing the v4 changes, so `3` could no longer be merged back into `master`, thereby inverting the workflow.
<issue_comment>username_0: So, from what I can tell, I can't really use [silverstripe-saml](https://github.com/silverstripe/silverstripe-saml) until we can figure out a solid way to make it SS3 compatible, otherwise we may have duplicated code (since I cannot have my SAML integrated with the LDAP dependencies, the point of this ticket).
<issue_comment>username_3: I'm not especially concerned by this at this stage
Disney Wish to be delayed?
Looks like the Wish may be delayed, worst case is the two unnamed Tritons getting cancelled.
Based on the video Meyer-Werft released today, it's likely that until things can return to a normal work schedule (they're currently shut down due to the virus, just like everywhere else), most of the ships already under construction will be finished; then they'll re-evaluate the build schedule and discuss with the cruise lines that have builds on order whether they still need each vessel, and if not, move on to the next one. Their workflow is definitely going to be slower, probably reduced to one smaller ship and one bigger ship at a time. I expect that if DCL still wants the Wish built, they will probably see a launch 6 months to a year after their original target inaugural cruise. The entire cruise industry will take a long look at its future and adjust its requirements, and even its need for new ships. We could conceivably see DCL cancel new builds #6 or #7, or both.
Income from Disney+ is their one bright spot and it certainly won't make up for months of income lost from all their big revenue sources.
I'm not familiar with how contracts with shipyards work but I would imagine that Disney is already committed to the Wish or at least would face costly penalties to withdraw since the steel has been cut.
This is pure speculation as well but the Magic and Wonder aren't exactly spring chickens anymore. As they begin recovery, it will be interesting to see what their futures are. I doubt we'll ever publicly know but it wouldn't surprise me if they end up retiring them earlier than planned if demand ends up being extremely weak. As in, maybe the original intent was to sail all 7 ships for several years but now they'll be looking at 4 or 5 simultaneously. Who knows, though.
To add to that, I absolutely think you're going to see the larger cruise lines get rid of their older ships immediately. Some of these other lines with 30-40-50 ships...they may only need 30 of the 40 and all of theirs already exist so if you need to cut capacity, ditching the clunkers might be a good idea.
before the virus the wish was coming out in 2022. Late 2021 but no guests until 2022. Target date was Jan 2022 for guests.
They might cancel the 3rd one, but not the 2nd one. The cost of a ship is such that you make at least two. The developing/designing cost is high, and therefore at least two are built from the same blueprints (info from family who recently visited the Meyer Werft).
I was thinking they may cancel #2 since it was not originally planned, and they added it in after the announcement of the other two. It also helps spread out payments a little more with at least a year between the ships - especially if production is also stretched out.
Generally, 25-30 years for the line they were built for and then sold to a down market line and run 5-15 years more. Of course, the better shape the ship is kept it the longer it will stay in each of those levels.
Would it be more difficult for Disney since there's so much theming inside? I feel like a lot of that would be hard to cover up for a non-Disney line, plus you add in the sentimental factor for so much of their audience.
OMG. That’s sad but at the same time, I understand that if the money (and the demand) is not there, it’s not a good timing to order them.
EDIT: Let’s not forget that the economy is not great and the unemployment rate in North America is currently extremely high. If DCL keeps its prices high, people who used to be able to afford them might not be able to afford them anymore.
I thought the Wish was a different blueprint than the two others after it. Or, are all 3 the same blueprint?
I believe all 3 are the same class.
I am actually from the other mindset where I think they would continue building the new 3 ships.
First, the publicity is a big thing for new ships. It will drive business back to the brand. I can't tell you how many times I have seen people on here say "I'm just going to wait for the new ships for my next cruise"
Second, because Disney is a smaller brand, it's not like NCL, RCCL, Celebrity, etc. that have dozens of ships. 7 total is an entirely manageable amount. The reality of the Magic and Wonder retiring in the next few years is completely realistic. '98 and '99 builds are quite old for ships even though they have been kept up immaculately. If anything major goes mechanically wrong, I would guess it's more cost-effective to just build
/// <exception cref="ArgumentException"></exception>
public UUID(ReadOnlyMemory<byte> data)
: this(data, BitConverter.IsLittleEndian)
{
}
/// <summary>
/// Initializes an instance of <see cref="UUID"/>.
/// Assumes the <paramref name="guid"/> has platform byte order.
/// </summary>
/// <remarks>
/// <paramref name="guid"/> data is first copied to a new array.
/// </remarks>
/// <param name="guid"></param>
/// <exception cref="ArgumentException"></exception>
public UUID(Guid guid)
: this(guid.ToByteArray(), BitConverter.IsLittleEndian)
{
}
/// <summary>
/// Initializes an instance of <see cref="UUID"/>.
/// </summary>
/// <param name="hexString"></param>
/// <exception cref="ArgumentException"></exception>
public UUID(string hexString)
: this(Convert.FromHexString(hexString))
{
}
/// <summary>
/// Copies the <paramref name="source"/>, optionally swapping the endianness.
/// </summary>
/// <param name="source">The source <see cref="UUID"/></param>
/// <param name="endianSwap">Whether to perform an endian swap</param>
public UUID(UUID source, bool endianSwap)
: this(
endianSwap ? source.Data1.Swap() : source.Data1,
endianSwap ? source.Data2.Swap() : source.Data2,
endianSwap ? source.Data3.Swap() : source.Data3,
source.Data4,
source.Data5,
endianSwap ? !source.IsLittleEndian : source.IsLittleEndian)
{
}
/// <summary>
/// Copies the <paramref name="source"/>, with the same endianness.
/// </summary>
/// <param name="source">The source <see cref="UUID"/></param>
public UUID(UUID source)
: this(source, endianSwap: false)
{
}
/// <summary>
/// Implicit conversion to <see cref="Guid"/>. The resulting <see cref="Guid"/> will
/// have platform byte order.
/// </summary>
/// <param name="uuid"></param>
public static implicit operator Guid(UUID uuid)
{
if (uuid.IsLittleEndian != BitConverter.IsLittleEndian)
{
uuid = uuid.EndianSwap();
}
return new Guid(uuid._data[UUID_Range].Span);
}
/// <summary>
/// Implicit conversion from <see cref="Guid"/>.
/// </summary>
/// <param name="guid"></param>
public static implicit operator UUID(Guid guid) => new(guid);
/// <summary>
/// Implicit conversion to <see cref="ReadOnlyMemory{Byte}"/>.
/// </summary>
/// <param name="uuid"></param>
public static implicit operator ReadOnlyMemory<byte>(UUID uuid) => uuid._data[UUID_Range];
/// <inheritdoc />
public static bool operator ==(UUID left, UUID right) => left.Equals(right);
/// <inheritdoc />
public static bool operator !=(UUID left, UUID right) => !(left == right);
/// <summary>
/// Creates a new instance of the Nil <see cref="UUID"/>.
/// </summary>
public static UUID Nil => new();
/// <summary>
/// Creates a new instance of Version V(5) <see cref="UUID"/>.
/// </summary>
/// <param name="namespace">The namespace <see cref="UUID"/></param>
/// <param name="name">The name</param>
public static UUID V(UUID @namespace, string name)
{
// ensure network byte order (big endian)
if (@namespace.IsLittleEndian)
{
@namespace = @namespace.EndianSwap();
}
var namespacedName = @namespace._data
.ToArray()
.Concat(Encoding.Unicode.GetBytes(name))
.ToArray();
byte[] hash = SHA1.HashData(namespacedName);
byte[] result = new byte[16];
//Copy first 16-bytes of the hash into our future Guid result
Array.Copy(hash, result, 16);
//set high-nibble to 5 to indicate type 5
result[6] &= 0x0F;
result[6] |= 0x50;
//set upper two bits to 2 for native variant
result[8] &= 0x3F;
result[8] |= 0x80;
return new UUID(result, @namespace.IsLittleEndian);
}
/// <summary>
/// Creates a new instance of a Version IV(4) <see cref="UUID"/>.
/// </summary>
public static UUID IV()
{
byte[] result = Guid.NewGuid().ToByteArray();
//set high
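The version-5 construction in the `V` method above (SHA-1 of namespace plus name, then forcing the version nibble to 5 and the variant bits to `10`) can be cross-checked against Python's standard `uuid` module, which implements the same RFC 4122 scheme. A sketch; note one assumption-relevant difference: the C# code hashes the name as UTF-16 via `Encoding.Unicode`, whereas Python's `uuid5` uses UTF-8, so the two produce different UUIDs for the same inputs.

```python
import uuid

# RFC 4122 version-5 UUID: SHA-1 of namespace + name, truncated to 16 bytes,
# with version/variant bits overwritten.
u = uuid.uuid5(uuid.NAMESPACE_DNS, "example.com")

raw = u.bytes  # big-endian (network order) layout
assert raw[6] >> 4 == 5        # high nibble of byte 6 carries the version (0x5_)
assert raw[8] & 0xC0 == 0x80   # top two bits of byte 8 are 10 -> RFC 4122 variant
print(u)
```

This mirrors the `result[6] |= 0x50` and `result[8] |= 0x80` masking done above, just read back out of the finished UUID.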
.LOGOUT);
}
}
logoutAgent(agentId: string = this.agentId) {
return this._deps.evClient.logoutAgent(agentId);
}
beforeAgentLogout(callback: () => void) {
this._eventEmitter.on(loginStatus.LOGOUT_BEFORE, callback);
}
newReconnect(isBlock: boolean = true) {
this._deps.evClient.closeSocket();
const fn = this.loginAgent;
return isBlock ? this._deps.block.next(fn) : fn();
}
async authenticateWithToken({
rcAccessToken = this._deps.auth.accessToken,
tokenType = 'Bearer',
shouldEmitAuthSuccess = true,
}: AuthenticateWithTokenType = {}) {
console.log('authenticateWithToken', shouldEmitAuthSuccess);
try {
this._deps.evClient.initSDK();
const authenticateResponse = await this._deps.evClient.getAndHandleAuthenticateResponse(
rcAccessToken,
tokenType,
);
const agent = { ...this.agent, authenticateResponse };
// if (shouldEmitAuthSuccess && !this._authenticateResponseWatcher) {
// this._authenticateResponseWatcher = watch(
// this,
// () => this.agent?.authenticateResponse,
// (authenticateResponse) => {
// if (authenticateResponse) {
// this._emitAuthSuccess();
// this._authenticateResponseWatcher();
// this._authenticateResponseWatcher = null;
// }
// },
// );
// }
this.setAgent(agent);
this.setAuthSuccess();
if (shouldEmitAuthSuccess) {
this._emitAuthSuccess();
}
return authenticateResponse;
} catch (error) {
switch (error.type) {
case messageTypes.NO_AGENT:
this._deps.alert.warning({
message: error.type,
});
break;
case messageTypes.CONNECT_TIMEOUT:
case messageTypes.UNEXPECTED_AGENT:
this._deps.alert.danger({
message: error.type,
});
break;
default:
this._deps.alert.danger({
message: messageTypes.CONNECT_ERROR,
});
}
await this._logout();
}
}
async openSocketWithSelectedAgentId({
syncOtherTabs = false,
retryOpenSocket = false,
} = {}) {
console.log(
'openSocketWithSelectedAgentId',
syncOtherTabs,
retryOpenSocket,
);
try {
      // TODO: a timeout is needed here; if no message ever comes back, this will block the app.
const getAgentConfig = new Promise<EvAgentConfig>((resolve) => {
this._deps.evClient.on(EvCallbackTypes.LOGIN_PHASE_1, resolve);
});
const selectedAgentId = this.agentId;
if (!selectedAgentId) {
throw new EvTypeError({
type: messageTypes.NO_AGENT,
});
}
const openSocketResult = await this._deps.evClient.openSocket(
selectedAgentId,
);
      // wait for socketOpened,
      // because instance.socketOpened() is invoked after the callback.
await sleep(0);
if (openSocketResult.error) {
console.log('retryOpenSocket~~', retryOpenSocket);
if (retryOpenSocket) {
const { access_token } = await this._deps.auth.refreshToken();
const authenticateRes = await this.authenticateWithToken({
rcAccessToken: access_token,
shouldEmitAuthSuccess: false,
});
if (!authenticateRes) return;
const openSocketRes: any = await this.openSocketWithSelectedAgentId({
syncOtherTabs,
});
return openSocketRes;
}
throw new EvTypeError({
type: messageTypes.OPEN_SOCKET_ERROR,
});
}
// TODO: implement multiple sync back drop
if (syncOtherTabs && this.tabManagerEnabled) {
this._deps.tabManager.send(tabManagerEvents.OPEN_SOCKET);
}
const agentConfig = await getAgentConfig;
const agent = { ...this.agent, agentConfig };
// if (!this._agentConfigWatcher) {
// this._agentConfigWatcher = watch(
// this,
// () => this.agent?.agentConfig,
// (agentConfig) => {
// if (agentConfig) {
// this._emitLoginSuccess();
// this._agentConfigWatcher();
// this._agentConfigWatcher = null;
// }
// },
// );
// }
this.setConnectionData({ agent, connected: true });
this.connecting = false;
this.setLoginSuccess();
this._emitLoginSuccess();
return agentConfig;
} catch (error) {
switch (error.type) {
case messageTypes.NO_AGENT:
this._deps.alert.warning({
message: error.type,
});
|
Well before AthleanX I was doing "my own thing" and I was stuck. After 2 years of doing AthleanX programs, I feel that I am in the best shape of my life. I am 31 years old and I feel stronger and much better overall than I felt when I was in high school.
Quick thoughts about the programs I have:
* AX1: The first program I did. When I did it I was able to work out only 4 days a week (the program is 5 days a week), so it took me longer to complete. Still, I saw and felt improvements. People even noticed that something was "different" in me. It creates a solid foundation for more advanced programs.
* AX2: Not a big fan of the programming on this one. It changes the focus every week. Every week there is a challenge that you are supposed to pass before moving on to the next week.
* XERO: It taught me NOT to overestimate bodyweight training. This program will humble you and laugh at you when you are on the floor missing how happy you were with dumbbells. It actually has the option to do "Xero-1 degree" or "Xero-2 degrees", which are more beginner-friendly versions of the program. The lack of pull-up bar exercises makes it difficult to target the back and biceps.
* XERO 2: 12 weeks instead of Xero's 6. Not necessarily harder than Xero (because Xero is HARD already). It actually improves on Xero by making use of a pull-up bar.
* Max Shred: except for the "Black Widows" workouts, which seem to be designed to be super hard just for the sake of being super hard, without being realistic, it is a 10/10 program if you are looking to improve your conditioning without losing muscle.
* Max Size: German Volume Training, bro splits, concentric/eccentric focus and some conditioning here and there. It does not reinvent the wheel but it is effective, no doubt about that.
* Ultimate Arms: actually a full body program, with a bit more emphasis on the arms. Ironically, some of my favorite leg day routines are here!
* Beaxst: Loved how with "just" 3 days a week, you feel like you worked out the whole week due to how intense the workouts are. I loved the corrective/cardio days. Some weird setups with ISO holds and the wild 4/8/16RM sets, but overall, one of the best programs if you want to work out hard.
* Bane: One month (it is a bonus). Probably the closest thing to a strongman program Jeff has put out there. A lot of carries, low-rep sets, more emphasis on the traps than usual (because, you know, [Bane](https://64.media.tumblr.com/2dce578279b53f7bc4d6c133faa63378/tumblr_pei0edteHL1vlbdlu_1280.jpg))
* Old School Iron: In my opinion, this program makes Max Size seem redundant. It is actually 4 months (the first month focuses on increasing strength, the other 3 are 100% hypertrophy). If you have access to a full gym and you are looking for hypertrophy, this is your program.
* Dragon: My favorite month of Athlean X. Heavily based on Bruce Lee's training, which I am a big fan of. I would not be surprised if Jeff took a look at this [page](https://www.thebioneer.com/bruce-lee-training-routines/) and based Dragon on it. My main goal is to have a healthy body, and Dragon seems to be perfect for that goal. It has a dedicated running day, and a "Peripheral Heart Action Training" day which is unique to this program (among Athlean X programs) and I felt it is one of the most efficient burst workouts I've ever done. The other 3 days are full body training days, with dedicated grip and abs training incorporated, besides isometric holds and of course the big lifts like squats, bench press... It is indeed a very complete month IMO.
* Iron Curtain: One month (it is a bonus). I haven't done this yet, but it looks like it is designed around Olympic lifts.
* Armored Abs: I haven't done this yet, but there is not a lot of mystery here. All days are abs days, so it is actually supposed to be done as a complement to another program.
* All American Muscle: I only have access to the first month for now. It is an upper/lower/upper/lower split, plus 2 cardio days based on how a football or basketball player would train for speed and agility. Drills
|
<issue_start><issue_comment>Title: No auto-completion for dynamic attributes
username_0: Jedi version: 0.11.0
Python version: 3.6.3
Using the following test script:
```python
class Foo(object):
def foo(self):
print('foo')
class Bar(object):
def __init__(self):
self._fooInstance = Foo()
def __dir__(self):
return sorted(set(self._fooInstance.__dir__()))
def __getattr__(self, name):
if hasattr(self._fooInstance, name):
return getattr(self._fooInstance, name)
raise AttributeError('Instance of {} has no attribute {:s}'.format(self.__class__.__name__, name))
bar = Bar()
bar.
```
Running the following jedi test:
```python
import jedi
with open('Test.py', 'r') as f:
src = ''.join(f.readlines())
print(jedi.Script(src).completions())
```
I get the completion for the `Bar` object as if it had its default `__dir__`. But what I want is the actual output of `Bar.__dir__`. Because of this, there is no auto-completion for dynamic attributes. In the last line of the test script, I would expect `foo` to be in the auto-completion suggestions, as it's a method of the `Foo` object inside of `Bar`, which handles the `__getattr__` calls to the outer object.
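For reference, the delegation pattern above does work at runtime: Python's built-in `dir()` consults a custom `__dir__`, which is what runtime (`Interpreter`-style) completion can rely on, as opposed to static `Script` analysis. A minimal sketch of the same pattern (names here are illustrative only):

```python
class Foo:
    def foo(self):
        print('foo')

class Bar:
    def __init__(self):
        self._foo_instance = Foo()

    def __dir__(self):
        # merge the default attributes with those delegated to the inner object
        return sorted(set(super().__dir__()) | set(dir(self._foo_instance)))

    def __getattr__(self, name):
        if hasattr(self._foo_instance, name):
            return getattr(self._foo_instance, name)
        raise AttributeError(name)

bar = Bar()
print('foo' in dir(bar))  # prints True: runtime dir() sees the delegated method
```

This is exactly the information that is invisible to purely static analysis, since producing it requires calling `__dir__` on a live object.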
<issue_comment>username_1: I can see where you're going with this. I'm not sure if working with `__dir__` is the right idea, I'll just add it to the list of potential features in the future.
<issue_comment>username_2: FWIW, IPython's completions have used `__dir__` for a long time, and I don't recall having many problems resulting from it. I think this makes sense at least in `Interpreter` for objects that are already initialized; I imagine it's not practical to create instances solely to call their `__dir__` method.
<issue_comment>username_1: @username_2 Jedi uses `__dir__` when using the `Interpreter`. The question here is about statically analysing a `__dir__` function and adding the results to the completions.
The big problem is basically just that it's hard to do that not that I don't want to do that ;-)
<issue_comment>username_2: Ah, gotcha. We're experiencing a problem where we don't seem to get `__dir__` completions using `Interpreter`, and we thought this might be the same issue. I'll investigate further.
<issue_comment>username_3: @username_0 Static analysis should not exec any code, to avoid letting the code do evil things.
<issue_comment>username_4: @username_3
Could this be whitelisted somehow?
Would jedi.settings work for this? (since only the user should be able to control it, right)
Should there be a list of class paths where `__dir__` gets used from?
But since `__dir__` appears to be a list of names only, maybe a custom dunder method could be used, which would return information about the type and signature for functions then.
(I just added a `__dir__` method for Neovim's RemoteApi (https://github.com/neovim/python-client/pull/346/files). In this case code gets executed on first use. IIRC information is available there about the args being used/supported, which could be communicated back to Jedi via something like `__completions__`)
<issue_comment>username_4: @username_3
Sure, therefore I think a whitelist would be needed - e.g. I would like to enable `neovim.api.common.RemoteApi` then in my case.
Not sure about `super()` calls then though - maybe Jedi could assume `[]` then, if not whitelisted.
However, I think something like `__completions__` would be more beneficial.
Or, we could have something in `jedi.settings` directly, e.g. a callback that would get information like the classname and other required info, and would return the completions to Jedi - kind of a Jedi plugin.
<issue_comment>username_1: Since we don't execute anything in Python files, there's no reason to use something like `__completions__` anyway. This will also not change, because if you don't start executing Python code you are going down a very dangerous path.
Compiled files (builtins, etc) are executed at the moment, but this might even change once we understand typeshed.<issue_closed>
<issue_comment>username_1: Fixed this. Obviously without code execution.
It is a heuristic and I don't think
|
First, before any analysis whatsoever of the Non-Partisan Association dinner last night — the gathering of the faithful who support Vancouver’s venerable centre-right party — I have to ask this. Look at this picture below and tell me who you think this person — currently involved in Canadian civic politics — is.
No, not the new mayor of Calgary, Naheed Nenshi. Nenshi looks like this.
But the similarity caused great hilarity at the NPA dinner last night, where Nenshi's apparent stunt double, Manjot Hallen, is a board member. Blogger Jonathan Ross promoted the idea of Hallen as a mayoral candidate in this city that has never managed to elect an Indo-Canadian candidate to council. Might have seemed laughable a few years ago, but after Nenshi's win in Calgary? Hmmm.
Anyway, on to the dinner, which wasn’t quite as exciting as the Calgary election.
The basics, for those who care about the patient’s current health levels. They managed to sell 350ish tickets @ $185 to their dinner at the Italian Cultural Centre (very nice dinner, btw, better than the dreck I’ve had at some). Maybe 300 showed up, but there definitely was a feeling of relief that it was a respectable crowd. Concord Pacific had a table, as did Aquilini. A healthy sprinkling of other architect/developer/development consultant types also.
Not quite the 600 that Mayor Gregor Robertson drew to Floata last month, where he drew wild applause for his speech in Chinese, but you never draw as much when you’re not the ones in power.
There also seemed to be a preponderance of federal Conservatives, though maybe that's just who I ended up talking to (Marko Dekovic and friends; Cheryl Chang, one-time NPA candidate, currently constituency president for Vancouver South Conservatives; Rachel Greenfield, Conservative candidate for Vancouver Centre.) I did also see Bill Yuen, former Liberal candidate and school-board trustee, and Doug Leung off in the distance and I likely missed others.
The odd couple of the night was former Vision mayoral candidate Jim Green, who loathes the NPA, with developer Rob MacDonald. The two decamped halfway through the dinner, btw, along with bar owner John Teti, to Vision Vancouver’s pub night at the Charles in the new Woodward’s building.
And? And? you’re saying to yourselves, get to the point, anyone say anything interesting?
Well, Peter Ladner and Sam Sullivan (who fought each other for the mayoral nomination in 2008, which resulted in Sam losing his chance to lead the party into the election and in the NPA almost getting wiped out) kissed and made up, with each other and with the party.
But more importantly, I think everyone was listening for what would be the rallying cry for the next election. And I’m not sure they got an answer.
But what did we hear?
Jonathan Baker, given time to make a speech as a past mayoral candidate, talked about the importance of preserving single-family housing.
But Peter Ladner defended bike lanes, as did current councillor Suzanne Anton. And Anton has long been a champion for EcoDensity, which translates to encouraging densification everywhere including, yes, single-family neighbourhoods.
So what among that is going to motivate voters to go to the polls?
As election after election shows (last time in Vancouver, this year in Calgary), people go out to vote when they feel as though someone is offering a message of change. They want something to vote for, even if it means they’re then bitterly disappointed when their candidate doesn’t transform the world.
Is that message — we’ll operate more on consensus, we’ll consult before we do things, we’ll be more centrist — enough to motivate people? And will those voters be able to figure out what the NPA actually stands for, when some people in the party say bike lanes or density are a terrible mistake and others say they’re good but have to be done the right way?
There was a time when everyone understood the party differences in civic politics. The NPA supported less government intervention in everything, as a general principle; their only opponents back then, COPE, believed in the power of government programs to solve everything.
It’s not so clear now. As I’ve said before, I’m not sure that campaigning on the theme of “We’re better managers” is enough to get people to the polls. (Unless there’s a massive spending scandal, and I mean massive, not a few thousand for a cycling conference.) The current provincial NDP has the same problem. I don’t hear them offering much more than “We won’t be so terrible” to potential voters.
That doesn’t inspire all the non-voters out there, the ones who are making the differences in elections when they do show up. They want to hear about how your core values are different, about the real choice you’ll offer them. But the NPA is searching hard
|
<filename>ViewSrc.Designer.cs
namespace PaintDotNet.Effects
{
partial class ViewSrc
{
/// <summary>
/// Required designer variable.
/// </summary>
private System.ComponentModel.IContainer components = null;
/// <summary>
/// Clean up any resources being used.
/// </summary>
/// <param name="disposing">true if managed resources should be disposed; otherwise, false.</param>
protected override void Dispose(bool disposing)
{
if (disposing && (components != null))
{
components.Dispose();
}
base.Dispose(disposing);
}
#region Windows Form Designer generated code
/// <summary>
/// Required method for Designer support - do not modify
/// the contents of this method with the code editor.
/// </summary>
private void InitializeComponent()
{
this.components = new System.ComponentModel.Container();
this.TextSrcBox = new System.Windows.Forms.TextBox();
this.ButtonClose = new System.Windows.Forms.Button();
this.toolTip1 = new System.Windows.Forms.ToolTip(this.components);
this.CopyButton = new PaintDotNet.Effects.ScaledButton();
this.SaveButton = new PaintDotNet.Effects.ScaledButton();
this.SuspendLayout();
//
// TextSrcBox
//
this.TextSrcBox.AcceptsReturn = true;
this.TextSrcBox.AcceptsTab = true;
this.TextSrcBox.Anchor = ((System.Windows.Forms.AnchorStyles)((((System.Windows.Forms.AnchorStyles.Top | System.Windows.Forms.AnchorStyles.Bottom)
| System.Windows.Forms.AnchorStyles.Left)
| System.Windows.Forms.AnchorStyles.Right)));
this.TextSrcBox.Font = new System.Drawing.Font("Microsoft Sans Serif", 9F, System.Drawing.FontStyle.Regular, System.Drawing.GraphicsUnit.Point, ((byte)(0)));
this.TextSrcBox.Location = new System.Drawing.Point(13, 13);
this.TextSrcBox.Multiline = true;
this.TextSrcBox.Name = "TextSrcBox";
this.TextSrcBox.ReadOnly = true;
this.TextSrcBox.ScrollBars = System.Windows.Forms.ScrollBars.Vertical;
this.TextSrcBox.Size = new System.Drawing.Size(559, 305);
this.TextSrcBox.TabIndex = 1;
this.TextSrcBox.KeyDown += new System.Windows.Forms.KeyEventHandler(this.TextSrcBox_KeyDown);
//
// ButtonClose
//
this.ButtonClose.Anchor = ((System.Windows.Forms.AnchorStyles)((System.Windows.Forms.AnchorStyles.Bottom | System.Windows.Forms.AnchorStyles.Right)));
this.ButtonClose.DialogResult = System.Windows.Forms.DialogResult.Cancel;
this.ButtonClose.FlatStyle = System.Windows.Forms.FlatStyle.System;
this.ButtonClose.Location = new System.Drawing.Point(497, 326);
this.ButtonClose.Name = "ButtonClose";
this.ButtonClose.Size = new System.Drawing.Size(75, 24);
this.ButtonClose.TabIndex = 0;
this.ButtonClose.Text = "Close";
this.ButtonClose.UseVisualStyleBackColor = true;
this.ButtonClose.Click += new System.EventHandler(this.ButtonClose_Click);
//
// CopyButton
//
this.CopyButton.Anchor = ((System.Windows.Forms.AnchorStyles)((System.Windows.Forms.AnchorStyles.Bottom | System.Windows.Forms.AnchorStyles.Left)));
this.CopyButton.FlatAppearance.BorderSize = 0;
this.CopyButton.FlatStyle = System.Windows.Forms.FlatStyle.Flat;
this.CopyButton.ImageName = "Copy";
this.CopyButton.Location = new System.Drawing.Point(13, 325);
this.CopyButton.Name = "CopyButton";
this.CopyButton.Size = new System.Drawing.Size(27, 27);
this.CopyButton.TabIndex = 3;
this.CopyButton.TextImageRelation = System.Windows.Forms.TextImageRelation.ImageBeforeText;
this.toolTip1.SetToolTip(this.CopyButton, "Copy all to clipboard");
this.CopyButton.UseVisualStyleBackColor = true;
this.CopyButton.Click += new System.EventHandler(this.CopyButton_Click);
//
// SaveButton
//
this.SaveButton.Anchor = ((System.Windows.Forms.AnchorStyles)((System.Windows.Forms.AnchorStyles.Bottom | System.Windows.Forms.AnchorStyles.Left)));
this.SaveButton.FlatAppearance.BorderSize = 0;
this.SaveButton.FlatStyle = System.Windows.Forms.FlatStyle.Flat;
this.SaveButton.ImageName = "Save";
this.SaveButton.Location = new System.Drawing.Point(46, 325);
this.SaveButton.Name = "SaveButton";
this.SaveButton.Size = new System.Drawing.Size(27, 27);
this.SaveButton.TabIndex = 2;
this.SaveButton.TextImageRelation = System.Windows.Forms.TextImageRelation.ImageBeforeText;
this.toolTip1.SetToolTip(this.SaveButton, "Save to file");
this.SaveButton.UseVisualStyleBackColor = true;
this.SaveButton.Click += new System.EventHandler(this.SaveButton_Click);
//
// ViewSrc
//
this.AutoScaleDimensions = new System.Drawing.SizeF(96F, 96F);
this.CancelButton = this.ButtonClose;
this.ClientSize = new System.Drawing.Size(5
|
zero of=./test1 bs=1m count=10240
10240+0 records in
10240+0 records out
10737418240 bytes transferred in 9.070568 secs (1183764693 bytes/sec)
bash-3.2# rclone copy ./test1 aws:/firebird-leo/
bash-3.2# time rclone copy aws:/firebird-leo/test1 aws:/firebird-leo/test2
real 3m55.478s
user 0m0.129s
sys 0m0.075s
bash-3.2# time rclone move aws:/firebird-leo/test1 aws:/firebird-leo/test3
real 3m15.304s
user 0m0.134s
sys 0m0.066s
bash-3.2# time rclone move aws:/firebird-leo/test2 aws:/firebird-leo/test4
real 1m56.005s
user 0m0.113s
sys 0m0.054s
bash-3.2#
```
Using an IBM COS bucket, I got the following numbers.
```
root@tapsrv01:/tmp# dd if=/dev/zero of=./test1 bs=1M count=10240
10240+0 records in
10240+0 records out
10737418240 bytes (11 GB, 10 GiB) copied, 8.05723 s, 1.3 GB/s
root@tapsrv01:/tmp# rclone --no-check-certificate copy ./test1 tucson:leo/
root@tapsrv01:/tmp# time rclone --no-check-certificate copy tucson:leo/test1 tucson:leo/test2
real 1m31.070s
user 0m0.608s
sys 0m0.164s
root@tapsrv01:/tmp# time rclone --no-check-certificate move tucson:leo/test1 tucson:leo/test3
real 1m31.824s
user 0m0.597s
sys 0m0.132s
root@tapsrv01:/tmp# time rclone --no-check-certificate move tucson:leo/test2 tucson:leo/test4
real 1m32.685s
user 0m0.651s
sys 0m0.084s
```
The AWS S3 move operation had a large fluctuation between the first and second moves. But it's consistent that they all took a significant amount of time to complete.
<issue_comment>username_2: Looks like both _S3_ and _IBM COS_ are `[NOAPIMOVE] [MOVEISCOPY] [DUMBCOPY] [FASTDELETE]`
(see legend in updated https://github.com/rclone/rclone/pull/4675#issuecomment-728376880).
Time variations can be attributed to the provider ops scheduler selecting different disk groups for each operation.
Conclusions:
1. You are right, chunker should avoid `move` completely.
2. Current implementation of `Move` in chunker is transaction-unsafe on such providers (see [copyOrMove](https://github.com/rclone/rclone/blob/v1.53.2/backend/chunker/chunker.go#L1464)). Moving chunks is on average as slow as copy and by the time it "commits" (ie moves metadata) the concurrent operations will hit the composite in an inconsistent state with a good probability. The solution is to
rewrite the sequence `move chunks, move metaobject`
as `get new xactID, copy chunks, commit metaobject, delete old chunks`.
For "normal" providers (slow copy, quasi-instant move) the _former_ sequence can stay as is (_quasi_-safe but reasonably fast).
I'm not sure though if this change belongs with this PR or can be postponed to another PR. Let @username_1 decide.
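The proposed reordering can be illustrated with a toy in-memory store. Everything below is a hypothetical sketch of the ordering only (names and data layout are made up, not rclone's actual chunker API): the key point is that the destination metaobject is committed before any old chunks are deleted, so a crash before the commit leaves the old composite fully intact, and a crash after it leaves only orphaned old chunks to garbage-collect.

```python
import uuid

def safe_move(store, src, dst):
    """Move a composite object via copy-commit-delete instead of move-move."""
    xact = uuid.uuid4().hex[:8]                         # 1. get new xactID
    old_chunks = [k for k in store if k.startswith(src + ".chunk")]
    new_chunks = []
    for k in old_chunks:                                # 2. copy chunks (slow, but safe)
        new_key = k.replace(src, dst, 1) + "_" + xact
        store[new_key] = store[k]
        new_chunks.append(new_key)
    store[dst] = {"xact": xact, "chunks": new_chunks}   # 3. commit metaobject
    for k in old_chunks:                                # 4. delete old chunks
        del store[k]
    del store[src]                                      #    and the old metaobject

store = {"a": {"chunks": ["a.chunk1", "a.chunk2"]},
         "a.chunk1": b"x", "a.chunk2": b"y"}
safe_move(store, "a", "b")
print(sorted(store))  # old keys gone, new metaobject plus renamed chunks remain
```

On "normal" providers with a quasi-instant server-side move, this ordering costs an extra copy, which is why the existing sequence could stay as-is there.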
<issue_comment>username_0: @username_2 The latest commit should address most of the points you brought up. Mainly, I made changes to ensure that chunker is backwards compatible, to the extent possible, and made it so the new codepath
|
0x07, 0x11, 0x07, 0x13, 0x09, 0x02, 0x11, 0x06, 0x69, 0x11,
0x07, 0x28, 0x28, 0x00, 0x00, 0x06, 0x13, 0x0A, 0x1B, 0x8D, 0x46, 0x00,
0x00, 0x01, 0x13, 0x0B, 0x16, 0x13, 0x0C, 0x00, 0x11, 0x09, 0x17, 0x58,
0x13, 0x07, 0x02, 0x11, 0x07, 0x28, 0x2D, 0x00, 0x00, 0x06, 0x13, 0x09,
0x11, 0x0B, 0x11, 0x0C, 0x02, 0x11, 0x07, 0x11, 0x09, 0x28, 0x28, 0x00,
0x00, 0x06, 0x9F, 0x11, 0x0B, 0x11, 0x0C, 0x96, 0x1F, 0x09, 0x6A, 0xFE,
0x02, 0x13, 0x0E, 0x11, 0x0E, 0x2C, 0x5B, 0x00, 0x02, 0x11, 0x0B, 0x11,
0x0C, 0x96, 0x28, 0x2E, 0x00, 0x00, 0x06, 0x13, 0x0F, 0x11, 0x0F, 0x2C,
0x24, 0x00, 0x11, 0x0B, 0x11, 0x0C, 0x11, 0x0B, 0x11, 0x0C, 0x96, 0x1F,
0x0D, 0x6A, 0x59, 0x6C, 0x23, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
0x40, 0x5B, 0x6C, 0x28, 0x67, 0x00, 0x00, 0x0A, 0x6A, 0x9F, 0x00, 0x2B,
0x22, 0x00, 0x11, 0x0B, 0x11, 0x0C, 0x11, 0x0B, 0x11, 0x0C, 0x96, 0x1F,
0x0C, 0x6A, 0x59, 0x6C, 0x23, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
0x40, 0x5B, 0x6C, 0x28, 0x67, 0x00, 0x00, 0x0A, 0x6A, 0x9F, 0x00, 0x00,
0x2B, 0x15, 0x00, 0x11, 0x0B,
|
>" the coal or wood fired blast furnace is over 500 years old and requires nothing more than fuel, the right shaped flue and a bellows
>It's not a new technology, you're right about that but it is more technical than you seem to believe. When you work with materials subjected to intense heat for prolonged periods of time the material is inherently unstable. This requires a lot of technical knowledge to operate at an efficient pace.
Yes, fortunately though bricks are both pretty thermally stable and relatively cheap, and their usage for the construction of furnaces is a widespread industrial undertaking. Construction of such facilities is a rather technically invested task, which is why - as mentioned - the Nazis sought tenders from companies experienced in building incinerators and crematoria when building them.
>" 3 Easy, you just put more fuel in."
>No. Again, you're really not giving enough credit to the people who operate these systems and you seem to think you flip on a switch and off you go.
>What about the optimal temperature level? Do you think some prisoners called Kapos are going to know when to shovel more fuel in? Do you think they'd care? Any idea how much coal(coke) or wood you'd need to keep all of the flues running continuously as we are told they were?
So you have one person who knows what they're doing, and one person who's good at hitting prisoners with a stick. They can walk back and forth between a couple of ovens and keep an eye on the forced labourers pretty easily. If you're willing to use pretty much any method, it's relatively easy to force people to perform menial tasks and even easier when you don't really care about their safety or wellbeing. Of course you don't flip a switch, they're coal or wood-fired. You have to introduce kindling and once it's burning stoke the flames - a job referred to as a stoker. The temperature and burn rate are then controlled by the addition of fuel and the control of airflow into the furnace. In the early to mid 1900s this was a very well understood manual task and could be done well enough by eye for such tasks as the heat treatment of steel.
And that's ignoring things like railway locomotives and steamships, which have fireboxes that can maintain high enough temperatures for hours or days at a time.
>I'm happy to delve further into the physics and mathematics behind these numbers if you would like.
I bet they include a heating/cooling cycle between sets of corpses - something that'd eat up a lot of time and energy but that didn't actually have to happen. Either that, or some assumption of the treatment of the deceased as human beings.
>Where are the mass graves of the 1.5 million you claim were shot on the Eastern front? I'll concede 100,000(and that's generous) shot by the Einsatzgruppen but there is no evidence for 1.5 million.
Apart from all the nice neat little reports that the Einsatzgruppen sent back regularly to Berlin listing numbers and locations? Well, the US government maintains a partial list, and there are [over 200 in Ukraine alone](http://www.heritageabroad.gov/LinkClick.aspx?fileticket=AXAPzlOWF5E%3D&portalid=0), funnily enough the graves were left unmarked, almost like there's something inherently shameful about mass murder. That said the reporting of the units themselves is pretty extensive, and is referenced in works such as *Ordinary Men*, a biography of Reserve Police Battalion 101 - one of many paramilitary units which committed the killings, [Available in its entirety including the reference pages listing sources here](http://hampshirehigh.com/exchange2012/docs/BROWNING-Ordinary%20Men.%20Reserve%20Police%20Battalion%20101%20and%20the%20Final%20Solution%20in%20Poland%20(1992\).pdf).
The graves still turn up pretty regularly though, during construction work or at other times the ground is disturbed. And given how hard it has been to find mass graves even from recent events like the breakup of Yugoslavia, it's not surprising they haven't all been found.
Of course [Yad Vashem](https://www.yadvashem.org/yv/en/about/institute/killing_sites_catalog.asp) keeps a massive list, but seeing as they're a Jewish organisation I know what your response to that will be, so I'll leave you with the diaries of Reserve Police Battalion 101 and the reports of the US Government instead. Oh, and [these](https://www.theguardian.com/world/2007/jun/06/secondworldwar.ukraine
|
});
await cmp.callMethod('setJeepPaginationActiveIndex',{activeIndex:2});
await page.waitForChanges();
const wrapEl: E2EElement = await container.find('.pagination-wrapper');
const spanEls: E2EElement[] = await wrapEl.findAll('.pagination-bullet');
expect(spanEls.length).toEqual(5);
await spanEls[1].click();
expect(indexEmit).toHaveReceivedEventTimes(1);
expect(indexEmit).toHaveReceivedEventDetail({index:"1"});
});
it('Should emit jeepPaginationIndex when the active index is set to 2, the number of slides set to 6 and the third bullet clicked ', async () => {
cmp.setProperty('clickable',true)
await page.waitForChanges();
const indexEmit = await cmp.spyOnEvent('jeepPaginationIndex');
await cmp.callMethod('setJeepPaginationSlidesNumber',{slides: 6});
await cmp.callMethod('setJeepPaginationActiveIndex',{activeIndex:2});
await page.waitForChanges();
const wrapEl: E2EElement = await container.find('.pagination-wrapper');
const spanEls: E2EElement[] = await wrapEl.findAll('.pagination-bullet');
expect(spanEls.length).toEqual(5);
await spanEls[2].click();
expect(indexEmit).toHaveReceivedEventTimes(1);
expect(indexEmit).toHaveReceivedEventDetail({index:"2"});
});
it('Should emit jeepPaginationIndex when the active index is set to 2, the number of slides set to 6 and the fourth bullet clicked ', async () => {
cmp.setProperty('clickable',true)
await page.waitForChanges();
const indexEmit = await cmp.spyOnEvent('jeepPaginationIndex');
await cmp.callMethod('setJeepPaginationSlidesNumber',{slides: 6});
await cmp.callMethod('setJeepPaginationActiveIndex',{activeIndex:2});
await page.waitForChanges();
const wrapEl: E2EElement = await container.find('.pagination-wrapper');
const spanEls: E2EElement[] = await wrapEl.findAll('.pagination-bullet');
expect(spanEls.length).toEqual(5);
await spanEls[3].click();
expect(indexEmit).toHaveReceivedEventTimes(1);
expect(indexEmit).toHaveReceivedEventDetail({index:"3"});
});
it('Should emit jeepPaginationIndex when the active index is set to 2, the number of slides set to 6 and the fifth bullet clicked ', async () => {
cmp.setProperty('clickable',true)
await page.waitForChanges();
const indexEmit = await cmp.spyOnEvent('jeepPaginationIndex');
await cmp.callMethod('setJeepPaginationSlidesNumber',{slides: 6});
await cmp.callMethod('setJeepPaginationActiveIndex',{activeIndex:2});
await page.waitForChanges();
const wrapEl: E2EElement = await container.find('.pagination-wrapper');
const spanEls: E2EElement[] = await wrapEl.findAll('.pagination-bullet');
expect(spanEls.length).toEqual(5);
await spanEls[4].click();
expect(indexEmit).toHaveReceivedEventTimes(1);
expect(indexEmit).toHaveReceivedEventDetail({index:"4"});
});
it('Should emit jeepPaginationIndex when the active index is set to 3, the number of slides set to 6 and the first bullet clicked ', async () => {
cmp.setProperty('clickable',true)
await page.waitForChanges();
const indexEmit = await cmp.spyOnEvent('jeepPaginationIndex');
await cmp.callMethod('setJeepPaginationSlidesNumber',{slides: 6});
await cmp.callMethod('setJeepPaginationActiveIndex',{activeIndex:3});
await page.waitForChanges();
const wrapEl: E2EElement = await container.find('.pagination-wrapper');
const spanEls: E2EElement[] = await wrapEl.findAll('.pagination-bullet');
expect(spanEls.length).toEqual(5);
await spanEls[0].click();
expect(indexEmit).toHaveReceivedEventTimes(1);
expect(indexEmit).toHaveReceivedEventDetail({index:"1"});
});
it('Should emit jeepPaginationIndex when the active index is set to 3, the number of slides set to 6 and the second bullet clicked ', async () => {
cmp.setProperty('clickable',true)
await page.waitForChanges();
const indexEmit = await cmp.spyOnEvent('jeepPaginationIndex');
await cmp.callMethod('setJeepPaginationSlidesNumber',{slides: 6});
await cmp.callMethod('setJeepPaginationActiveIndex',{activeIndex:3});
|
22] Invalid argument
t<link rel="stylesheet" href="/gui/css/shared.css" type="text/css">\n\t<link rel="stylesheet" href="/
```
I tried with SSL plugin and now get this error when trying cagent.exe debug:
```
c:\Program Files\cagent>cagent.exe debug
Debugging service cagent - press Ctrl+C to stop.
Info 0x400000FF - Created WinSvcLogHandler
Info 0x40001002 - The cagent service has started.
Error 0xC00000FF - ERROR:root:Task exception was never retrieved
future: <Task finished coro=<CalderaAgent.send_clients() done, defined at C:\Gitlab Runner\builds\f8e514af\0\caldera\caldera-agent\caldera_agent\agent.py:59> exception=RequestFailed()>
Traceback (most recent call last):
File "C:\Program Files\Python35\lib\asyncio\tasks.py", line 240, in _step
File "C:\Gitlab Runner\builds\f8e514af\0\caldera\caldera-agent\caldera_agent\agent.py", line 61, in send_clients
File "C:\Gitlab Runner\builds\f8e514af\0\caldera\caldera-agent\caldera_agent\rest_api.py", line 173, in clients
File "C:\Gitlab Runner\builds\f8e514af\0\caldera\caldera-agent\caldera_agent\rest_api.py", line 72, in rest
File "C:\Gitlab Runner\builds\f8e514af\0\caldera\caldera-agent\caldera_agent\rest_api.py", line 104, in rawrest
File "C:\Gitlab Runner\builds\f8e514af\0\caldera\caldera-agent\caldera_agent\rest_api.py", line 44, in start
File "C:\Gitlab Runner\builds\f8e514af\0\caldera\caldera-agent\caldera_agent\rest_api.py", line 143, in _rest
caldera_agent.rest_api.RequestFailed
Error 0xC00000FF - ERROR:caldera_agent.async_client:Request failed with status: 503
Error 0xC00000FF - ERROR:root:Task exception was never retrieved
future: <Task finished coro=<Client.heartbeat() done, defined at C:\Gitlab Runner\builds\f8e514af\0\caldera\caldera-agent\caldera_agent\async_client.py:38> exception=RequestFailed()>
Traceback (most recent call last):
File "C:\Program Files\Python35\lib\asyncio\tasks.py", line 240, in _step
File "C:\Gitlab Runner\builds\f8e514af\0\caldera\caldera-agent\caldera_agent\async_client.py", line 40, in heartbeat
File "C:\Gitlab Runner\builds\f8e514af\0\caldera\caldera-agent\caldera_agent\rest_api.py", line 192, in stay_alive
File "C:\Gitlab Runner\builds\f8e514af\0\caldera\caldera-agent\caldera_agent\rest_api.py", line 72, in rest
File "C:\Gitlab Runner\builds\f8e514af\0\caldera\caldera-agent\caldera_agent\rest_api.py", line 104, in rawrest
File "C:\Gitlab Runner\builds\f8e514af\0\caldera\caldera-agent\caldera_agent\rest_api.py", line 44, in start
File "C:\Gitlab Runner\builds\f8e514af\0\caldera\caldera-agent\caldera_agent\rest_api.py", line 143, in _rest
caldera_agent.rest_api.RequestFailed
Error 0xC00000FF - ERROR:caldera_agent.async_client:Request failed with status: 503
Error 0xC00000FF - ERROR:caldera_agent.async_client:Request failed with status: 503
Error 0xC00000FF - ERROR:caldera_agent.async_client:Request failed with status: 503
Error 0xC00000FF - ERROR:caldera
|
/*
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the MIT License. See License.txt in the project root for
* license information.
*
* Code generated by Microsoft (R) AutoRest Code Generator 0.16.0.0
* Changes may cause incorrect behavior and will be lost if the code is
* regenerated.
*/
import { ServiceClientOptions, RequestOptions, ServiceCallback } from 'ms-rest';
import * as models from '../models';
/**
* @class
* RecordSets
* __NOTE__: An instance of this class is automatically created for an
* instance of the DnsManagementClient.
*/
export interface RecordSets {
/**
* Updates a RecordSet within a DNS zone.
*
* @param {string} resourceGroupName The name of the resource group.
*
* @param {string} zoneName The name of the zone without a terminating dot.
*
* @param {string} relativeRecordSetName The name of the RecordSet, relative
* to the name of the zone.
*
* @param {string} recordType The type of DNS record. Possible values include:
* 'A', 'AAAA', 'CNAME', 'MX', 'NS', 'PTR', 'SOA', 'SRV', 'TXT'
*
* @param {object} parameters Parameters supplied to the Update operation.
*
* @param {string} [parameters.id] Gets or sets the ID of the resource.
*
* @param {string} [parameters.name] Gets or sets the name of the resource.
*
* @param {string} [parameters.type] Gets or sets the type of the resource.
*
* @param {string} [parameters.etag] Gets or sets the ETag of the RecordSet.
*
* @param {object} [parameters.metadata] Gets or sets the metadata attached to
* the resource.
*
* @param {number} [parameters.tTL] Gets or sets the TTL of the records in the
* RecordSet.
*
* @param {array} [parameters.aRecords] Gets or sets the list of A records in
* the RecordSet.
*
* @param {array} [parameters.aaaaRecords] Gets or sets the list of AAAA
* records in the RecordSet.
*
* @param {array} [parameters.mxRecords] Gets or sets the list of MX records
* in the RecordSet.
*
* @param {array} [parameters.nsRecords] Gets or sets the list of NS records
* in the RecordSet.
*
* @param {array} [parameters.ptrRecords] Gets or sets the list of PTR records
* in the RecordSet.
*
* @param {array} [parameters.srvRecords] Gets or sets the list of SRV records
* in the RecordSet.
*
* @param {array} [parameters.txtRecords] Gets or sets the list of TXT records
* in the RecordSet.
*
* @param {object} [parameters.cnameRecord] Gets or sets the CNAME record in
* the RecordSet.
*
* @param {string} [parameters.cnameRecord.cname] Gets or sets the canonical
* name for this record without a terminating dot.
*
* @param {object} [parameters.soaRecord] Gets or sets the SOA record in the
* RecordSet.
*
* @param {string} [parameters.soaRecord.host] Gets or sets the domain name of
 * the authoritative name server, without a terminating dot.
*
* @param {string} [parameters.soaRecord.email] Gets or sets the email for
* this record.
*
* @param {number} [parameters.soaRecord.serialNumber] Gets or sets the serial
* number for this record.
*
* @param {number} [parameters.soaRecord.refreshTime] Gets or sets the refresh
* value for this record.
*
* @param {number} [parameters.soaRecord.retryTime] Gets or sets the retry
* time for this record.
*
* @param {number} [parameters.soaRecord.expireTime] Gets or sets the expire
* time for this record.
*
* @param {number} [parameters.soaRecord.minimumTtl] Gets or sets the minimum
* TTL value for this record.
*
* @param {object} [options] Optional Parameters.
*
* @param {string} [options.ifMatch] The
|
){\textcolor[rgb]{0.7,0.7,0.7}{\circle*{1.204}}\circle{1.204}}
\put(11.426,21.195){\textcolor[rgb]{0.7,0.7,0.7}{\circle*{1.581}}\circle{1.581}}
\put(11.266,21.207){\textcolor[rgb]{0.7,0.7,0.7}{\circle*{1.26}}\circle{1.26}}
\put(11.162,21.224){\textcolor[rgb]{0.7,0.7,0.7}{\circle*{1.24}}\circle{1.24}}
\put(11.134,21.233){\textcolor[rgb]{0.7,0.7,0.7}{\circle*{1.443}}\circle{1.443}}
\put(10.884,21.257){\textcolor[rgb]{0.7,0.7,0.7}{\circle*{1.24}}\circle{1.24}}
\put(10.777,21.295){\textcolor[rgb]{0.7,0.7,0.7}{\circle*{0.798}}\circle{0.798}}
\put(10.749,21.306){\textcolor[rgb]{0.7,0.7,0.7}{\circle*{1.24}}\circle{1.24}}
\put(10.322,21.332){\textcolor[rgb]{0.7,0.7,0.7}{\circle*{1.098}}\circle{1.098}}
\put(10.176,21.389){\textcolor[rgb]{0.7,0.7,0.7}{\circle*{0.589}}\circle{0.589}}
\put(10.172,21.396){\textcolor[rgb]{0.7,0.7,0.7}{\circle*{1.562}}\circle{1.562}}
\put(9.496,21.434){\textcolor[rgb]{0.7,0.7,0.7}{\circle*{1.525}}\circle{1.525}}
\put(9.224,21.454){\textcolor[rgb]{0.7,0.7,0.7}{\circle*{1.525}}\circle{1.525}}
\put(8.732,21.501){\textcolor[rgb]{0.7,0.7,0.7}{\circle*{1.4}}\circle{1.4}}
\put(8.015,21.539){\textcolor[rgb]{0.7,0.7,0.7}{\circle*{1.543}}\circle{1.543}}
\put(7.439,21.56){\textcolor[rgb]{0.7,0.7,0.7}{\circle*{1.525}}\circle{1.525}}
\put(6.202,21.68){\textcolor[rgb]{0.7,0.7,0.7}{\circle*{0.49}}\circle{0.49}}
\put(5.354,21.717){\textcolor[rgb]{0.7,0.7,0.7}{\circle*{1.195}}\circle{1.195}}
}
\linethickness{1pt}
\put(6.,22.){\textcolor[rgb]{0.82, 0.153, 0.167}{\line(60,-1){11.97}}}
\put(17.97,21.8){\
|
<issue_start><issue_comment>Title: Refresh access token using long lived refresh token
username_0: I am trying to figure out how to update the access token using the refresh token which I got during the initial authorization. I tried multiple ways to do this but I always receive:
```
Signet::AuthorizationError: Authorization failed. Server message:
{
"error": "invalid_grant",
"error_description": "Bad Request"
}
```
The main method I have been adding changes to is:
```ruby
def refresh_google_calendar
@service = Google::Apis::CalendarV3::CalendarService.new
@service.authorization = google_secret
# Attempts to refresh access token if already expired.
@service.authorization.refresh! if @user.google_expires_at < Time.current
end
def google_secret
Signet::OAuth2::Client.new(refresh_options)
end
def refresh_options
{ token_credential_uri: 'https://oauth2.googleapis.com/token',
access_token: @user.google_token,
expires_at: @user.google_expires_at,
refresh_token: @user.google_refresh_token,
client_id: ENV.fetch('GOOGLE_CLIENT_ID'),
client_secret: ENV.fetch('GOOGLE_CLIENT_SECRET') }
end
```
I tried another variation of the `refresh_options` as:
```ruby
def refresh_options
{ refresh_token: @user.google_refresh_token,
token_credential_uri: 'https://oauth2.googleapis.com/token',
client_id: ENV.fetch('GOOGLE_CLIENT_ID'),
client_secret: ENV.fetch('GOOGLE_CLIENT_SECRET') }
end
```
Also attempted using `Google::Auth::UserRefreshCredentials` as explained [here](https://www.botreetechnologies.com/blog/google-api-authorization-with-ruby-using-long-lasting-refresh-token) but I still get the `invalid_grant` error.
I do know that my `Client ID` and `Client Secret` work because I use the same for the initial authorization request.
I would appreciate any help in figuring this out.
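For reference, the refresh exchange Signet performs under the hood is just a form-encoded POST to the token endpoint, per RFC 6749 §6. A minimal sketch of the request body — written in TypeScript purely for illustration, with a hypothetical helper name and placeholder values:

```typescript
// Builds the form-encoded body of an OAuth 2.0 refresh_token grant
// (RFC 6749 §6). Helper name and all values are illustrative placeholders.
function buildRefreshBody(params: {
  clientId: string;
  clientSecret: string;
  refreshToken: string;
}): string {
  return new URLSearchParams({
    grant_type: "refresh_token", // fixed by the OAuth 2.0 spec
    client_id: params.clientId,
    client_secret: params.clientSecret,
    refresh_token: params.refreshToken,
  }).toString();
}
```

If reproducing this exact POST against `https://oauth2.googleapis.com/token` (with `Content-Type: application/x-www-form-urlencoded`) still returns `invalid_grant`, the refresh token itself is usually the problem — revoked, expired, or issued to a different client ID — rather than the Ruby client setup.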
<issue_comment>username_1: I have fought for the last 2 days with authentication and had the exact same issue. It has now moved to being `Faraday::ConnectionFailed` => `execution expired` after I changed some code around. It seems I can't `fetch_access_token` anymore. And of course no explicit error to help steer towards a solution : /
<issue_comment>username_2: Hi guys, I have had the same problem for the last 2 weeks. Did you find any solution for it?
I try to write some updates to Google Spreadsheets but always get:
`Faraday::ConnectionFailed wrapped=#<Net::OpenTimeout: execution expired`
```
Signet::AuthorizationError: Unexpected error: #<Faraday::ConnectionFailed wrapped=#<Net::OpenTimeout: execution expired>>
from /home/deploy/s/shared/bundle/ruby/2.4.0/gems/googleauth-0.9.0/lib/googleauth/signet.rb:116:in 'rescue in retry_with_error'
from /home/deploy/s/shared/bundle/ruby/2.4.0/gems/googleauth-0.9.0/lib/googleauth/signet.rb:105:in 'retry_with_error'
from /home/deploy/s/shared/bundle/ruby/2.4.0/gems/googleauth-0.9.0/lib/googleauth/signet.rb:79:in 'fetch_access_token!'
from /home/deploy/s/releases/20190923113936/app/services/googlespreadsheets.rb:54:in 'append'
from (irb):2
from /home/deploy/s/shared/bundle/ruby/2.4.0/gems/railties-5.0.6/lib/rails/commands/console.rb:65:in 'start'
from /home/deploy/s/shared/bundle/ruby/2.4.0/gems/railties-5.0.6/lib/rails/commands/console_helper.rb:9:in 'start'
from /home/deploy/s/shared/bundle/ruby/2.4.0/gems/railties-5.0.6/lib/rails/commands/commands_tasks.rb:78:in 'console'
from /home/deploy/s/shared/bundle/ruby/2.4.0/gems/railties-5.0.6/lib/rails/commands/commands_tasks.rb:49:in 'run_command!'
from /home/deploy/s/shared/bundle/ruby/2.4.0/gems/railties-5.0.6/lib/rails/commands.rb:18:in '<top (required)>'
from bin/rails:9:in 'require'
|
<filename>Quick.Blazor.Bootstrap.Admin/FileExplorer.razor.cs
using BlazorDownloadFile;
using Microsoft.AspNetCore.Components;
using Quick.Blazor.Bootstrap.Admin.Utils;
using System;
using System.Collections;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Tewr.Blazor.FileReader;
namespace Quick.Blazor.Bootstrap.Admin
{
public partial class FileExplorer
{
private readonly UnitStringConverting storageUSC = UnitStringConverting.StorageUnitStringConverting;
[Inject]
private IBlazorDownloadFileService BlazorDownloadFileService { get; set; }
private ModalLoading loading;
private ModalAlert alert;
private ModalPrompt prompt;
private ElementReference inputFile;
private string CurrentPath;
private DirectoryInfo CurrentDir;
private DirectoryInfo[] Dirs;
private FileInfo[] Files;
private Object SelectedItem;
[Parameter]
public string Dir { get; set; }
[Parameter]
public string TextConfirm { get; set; } = "Confirm";
[Parameter]
public string TextConfirmDeleteFolder { get; set; } = "Do you want to delete folder[{0}]?";
[Parameter]
public string TextConfirmDeleteFile { get; set; } = "Do you want to delete file[{0}]?";
[Parameter]
public string TextInputNewName { get; set; } = "Please input new name of [{0}]?";
[Parameter]
public string TextSuccess { get; set; } = "Success";
[Parameter]
public string TextCanceled { get; set; } = "Canceled";
[Parameter]
public string TextFailed { get; set; } = "Failed";
[Parameter]
public string TextFolderNotExist { get; set; } = "Folder [{0}] not exist";
[Parameter]
public string TextUp { get; set; } = "Up";
[Parameter]
public string TextNewFolder { get; set; } = "New Folder";
[Parameter]
public string TextTransferSpeed { get; set; } = "Transfer Speed";
[Parameter]
public string TextRemainingTime { get; set; } = "Remaining Time";
[Parameter]
public string TextNewFolderPrompt { get; set; } = "Please input new folder name";
[Parameter]
public string TextUpload { get; set; } = "Upload";
[Parameter]
public string TextUploadReadFileInfo { get; set; } = "Reading upload file info...";
[Parameter]
public string TextUploadFileExist { get; set; } = "File [{0}] was exist.";
[Parameter]
public string TextUploadFileUploading { get; set; } = "Uploading file [{0}]...";
[Parameter]
public string TextRefresh { get; set; } = "Refresh";
[Parameter]
public string TextDownload { get; set; } = "Download";
[Parameter]
public string TextRename { get; set; } = "Rename";
[Parameter]
public string TextDelete { get; set; } = "Delete";
[Parameter]
public string TextPath { get; set; } = "Path";
[Parameter]
public string TextGoto { get; set; } = "Goto";
[Parameter]
public string TextCreationTime { get; set; } = "Creation Time";
[Parameter]
public string TextLastWriteTime { get; set; } = "Last Write Time";
[Parameter]
public string TextSize { get; set; } = "Size";
[Parameter]
public string TextName { get; set; } = "Name";
[Parameter]
public RenderFragment IconFolder { get; set; }
[Parameter]
public RenderFragment IconFile { get; set; }
protected override void OnParametersSet()
{
if (string.IsNullOrEmpty(Dir))
{
refresh();
}
else
{
gotoPath(Dir);
}
}
private void gotoPath(string path)
{
gotoDir(new DirectoryInfo(path));
}
private void gotoDir(DirectoryInfo dir)
{
CurrentDir = dir;
Dir = CurrentPath = dir?.FullName;
refresh();
}
private void btnGoto_Click()
{
if (string.IsNullOrEmpty(CurrentPath))
gotoDir(null);
else
gotoDir(new DirectoryInfo(CurrentPath));
}
private void btnGotoUpper_Click()
{
if (CurrentDir == null)
return;
var dir = CurrentDir.Parent;
gotoDir(dir);
}
private void btnCreateFolder_Click()
{
prompt.Show(TextNewFolderPrompt, TextNewFolder, dir_name =>
{
try
{
var new
|
<reponame>broglep/pokemongo-json-pokedex
import * as fs from 'fs-promise';
import * as path from 'path';
import * as mkdirp from 'mkdirp';
import { forEachSeries } from 'p-iteration';
import { AvatarCustomizationPipeline } from './processing/avatarCustomization/index';
import { MovePipeline } from './processing/move/index';
import { Pipeline } from './core/index';
import { PokemonPipeline } from './processing/pokemon/index';
import { TypePipeline } from './processing/type/index';
import chalk from 'chalk';
import { ItemPipeline } from './processing/item/index';
import { PokemonLocalesPipeline } from './processing/pokemon/locales/pokemonLocalesPipeline';
import { Locale } from './outcome/locales/locale.interface';
import { PokemonLocalTranslations } from './outcome/pokemon/index';
import { MoveLocalesPipeline } from './processing/move/locales/moveLocalesPipeline';
import { LocalesPipeline } from './core/pipeline/localePipeline';
import { ItemLocalesPipeline } from './processing/item/locales/itemLocalesPipeline';
import { AvatarCustomizationLocalesPipeline } from './processing/avatarCustomization/locales/index';
const gameMaster = require('./data/GAME_MASTER.json');
const specialGameMastersDirectory = path.resolve(__dirname, 'data/special');
const specialGameMasters = fs.existsSync(specialGameMastersDirectory) ? fs.readdirSync(specialGameMastersDirectory).map(file => require(`./data/special/${file}`)) : [];
const packageJson = require('../package.json');
const POKEMON_TRANSLATIONS = require('./data/POKEMON_TRANSLATIONS.json');
const MOVES_TRANSLATIONS = require('./data/MOVES_TRANSLATIONS.json');
const ITEMS_TRANSLATIONS = require('./data/ITEMS_TRANSLATIONS.json');
const LOCALES = ['de-DE', 'en-US', 'zh-TW', 'fr-FR', 'es-ES', 'ja-JP', 'it-IT', 'ko-KR', 'pt-BR'];
const done = (err, name) =>
err ?
console.log(`${chalk.red('✘')} Error while writing ${name}:`) :
console.log(`${chalk.green('✔')} Successfully written ${name}`);
const write = async (file: string, pipeline: Pipeline, name: string) => {
let data;
try {
data = await pipeline.Run();
}
catch (err) {
done(err, name);
return;
}
fs.writeFile(file, JSON.stringify(data, null, 4), err => done(err, name));
return data;
}
const writeTranslations = async (file: string, pipeline: LocalesPipeline<any, any>, name: string) => {
let translations: Locale<PokemonLocalTranslations>[];
try {
translations = await pipeline.Run();
}
catch (err) {
done(err, name);
return;
}
forEachSeries(translations, async translation => {
const folder = `./output/locales/${translation.name}`;
// @ts-ignore
const folderExists = await fs.exists(folder);
if (!folderExists) {
mkdirp.sync(folder);
}
await fs.writeFile(path.join(folder, file), JSON.stringify(translation.data, null, 4));
done(null, `${name} ${translation.name}`)
});
};
console.log(`${chalk.blue('i')} ${packageJson.name} ${chalk.cyan(packageJson.version)} `);
console.log(`${chalk.blue('i')} Using GAME_MASTER version ${chalk.cyan(gameMaster.version)}`);
const writePokemon = async () => {
const pokemons = await write('./output/pokemon.json', new PokemonPipeline(gameMaster, specialGameMasters), 'Pokemons');
writeTranslations('pokemon.json', new PokemonLocalesPipeline(POKEMON_TRANSLATIONS, pokemons, LOCALES), 'Pokemon Translations');
}
const writeMoves = async () => {
const moves = await write('./output/move.json', new MovePipeline(gameMaster), 'Moves');
writeTranslations('move.json', new MoveLocalesPipeline(MOVES_TRANSLATIONS, moves, LOCALES), 'Moves Translations');
}
const writeItems = async () => {
const items = await write('./output/item.json', new ItemPipeline(gameMaster), 'Items');
writeTranslations('item.json', new ItemLocalesPipeline(ITEMS_TRANSLATIONS, items, LOCALES), 'Items Translations');
}
const writeAvatarCustomization = async () => {
const avatarCustomization = await write('./output/avatar-customization.json', new AvatarCustomizationPipeline(gameMaster), 'Avatar Customizations');
writeTranslations('avatar-customization.json', new AvatarCustomizationLocalesPipeline(ITEMS_TRANSLATIONS, avatarCustomization, LOCALES), 'Avatar Customizations Translations');
};
writePokemon();
writeMoves();
writeItems();
writeAvatarCustomization();
write('./output/type.json',
|
this.state;
this.setState({
showCsvExport: !showCsvExport,
});
}
_onCancelEdit = () => {
const { oldWidget } = this.state;
if (oldWidget) {
const { id } = this.props;
WidgetActions.update(id, oldWidget);
}
this._onToggleEdit();
};
_onWidgetConfigChange = (widgetId, config) => WidgetActions.updateConfig(widgetId, config);
_setLoadingState = (loading: boolean) => this.setState({ loading });
visualize = () => {
const { data, errors, title } = this.props;
if (errors && errors.length > 0) {
return <ErrorWidget errors={errors} />;
}
if (data) {
const { editing } = this.state;
const { id, widget, height, width, fields } = this.props;
const { config, filter } = widget;
const VisComponent = _visualizationForType(widget.type);
return (
<VisComponent config={config}
data={data}
editing={editing}
fields={fields}
filter={filter}
height={height}
onConfigChange={(newWidgetConfig) => this._onWidgetConfigChange(id, newWidgetConfig)}
setLoadingState={this._setLoadingState}
title={title}
toggleEdit={this._onToggleEdit}
type={widget.type}
width={width}
id={id} />
);
}
return <LoadingWidget />;
};
// TODO: Clean up different code paths for normal/edit modes
render() {
const { id, widget, fields, onSizeChange, title, position, onPositionsChange, view } = this.props;
const { editing, loading, showCopyToDashboard, showCsvExport, showMoveWidgetToTab } = this.state;
const { config, type } = widget;
const { focusedWidget, setFocusedWidget } = this.context;
const isFocusedWidget = focusedWidget === id;
const visualization = this.visualize();
if (editing) {
const EditComponent = _editComponentForType(widget.type);
return (
<WidgetColorContext id={id}>
<EditWidgetFrame>
<MeasureDimensions>
<WidgetHeader title={title}
hideDragHandle
loading={loading}
onRename={(newTitle) => TitlesActions.set('widget', id, newTitle)}
editing={editing} />
<EditComponent config={config}
fields={fields}
editing={editing}
id={id}
type={widget.type}
onChange={(newWidgetConfig) => this._onWidgetConfigChange(id, newWidgetConfig)}>
<WidgetErrorBoundary>
{visualization}
</WidgetErrorBoundary>
</EditComponent>
</MeasureDimensions>
<SaveOrCancelButtons onFinish={this._onToggleEdit} onCancel={this._onCancelEdit} />
</EditWidgetFrame>
</WidgetColorContext>
);
}
return (
<WidgetColorContext id={id}>
<WidgetFrame widgetId={id} onSizeChange={onSizeChange}>
<InteractiveContext.Consumer>
{(interactive) => (
<WidgetHeader title={title}
hideDragHandle={!interactive || isFocusedWidget}
loading={loading}
onRename={(newTitle) => TitlesActions.set('widget', id, newTitle)}
editing={editing}>
<WidgetActionsWBar>
<IfInteractive>
<IfDashboard>
<ReplaySearchButton />
</IfDashboard>
<IconButton name={isFocusedWidget ? 'compress-arrows-alt' : 'expand-arrows-alt'}
title={isFocusedWidget ? 'Un-focus widget' : 'Focus this widget'}
onClick={() => setFocusedWidget(id)} />
{!isFocusedWidget && (
<WidgetHorizontalStretch widgetId={widget.id}
widgetType={widget.type}
onStretch={onPositionsChange}
position={position} />
)}
<WidgetActionDropdown>
<MenuItem onSelect={this._onToggleEdit}>Edit</MenuItem>
<MenuItem onSelect={() => this._onDuplicate(id, setFocusedWidget)}>Duplicate</MenuItem>
{type === MessagesWidget.type && <MenuItem onSelect={() => this._onToggleCSVExport()}>Export to CSV</MenuItem>}
<IfSearch>
<MenuItem onSelect={this._onToggleCopyToDashboard}>Copy to Dashboard</MenuItem>
</IfSearch>
<IfDashboard>
<MenuItem onSelect={this._onToggleMoveWidgetToTab}>Move to Page</MenuItem>
</IfDashboard>
<MenuItem divider />
<MenuItem onSelect={() => this._onDelete(widget)}>Delete</MenuItem>
</WidgetActionDropdown>
{showCopyToDashboard && (
<CopyToDashboard widgetId={id}
onSubmit={this._on
|
<issue_start><issue_comment>Title: test(storybook): test added for headings
username_0: - Added cypress test for: Placing cursor in paragraph and changing to header 1, 2, 3. Just wanted the maintainers to review the format and code quality. Once approved I'll add other tests for `Headings`.
### Related Issues
- Issue #14
### Author Checklist
- [x] Ensure you provide a [DCO sign-off](https://github.com/probot/dco#how-it-works) for your commits using the `--signoff` option of git commit.
- [ ] Vital features and changes captured in unit and/or integration tests
- [x] Commits messages follow [AP format](https://github.com/accordproject/techdocs/blob/master/DEVELOPERS.md#commit-message-format)
- [ ] Extend the documentation, if necessary
- [ ] Merging to `master` from `fork:branchname`
- [ ] Manual accessibility test performed
- [ ] Keyboard-only access, including forms
- [ ] Contrast at least WCAG Level A
- [ ] Appropriate labels, alt text, and instructions
<issue_comment>username_1: This looks like a great start, I'll try to review it as soon as I can.
<issue_comment>username_2: @username_0 How have you selected the text, since Cypress doesn't have any sort of API to select text? Check by running these tests once again.
<issue_comment>username_0: Nope. I had seen some scripts online which could do that. I'm focusing on other tests right now as they seem to be easy and can be completed ASAP. If you're able to get any solution, do tell.
<issue_comment>username_2: @username_0
```code
cd packages/storybook
npm run test:e2e
```
<issue_comment>username_0: I'll look into it
<issue_comment>username_1: I think this looks amazing! Looks like some of these tests may be currently failing? Can you resolve and then I'll add more reviewers on this?
<issue_comment>username_0: I guess it would have timed out, because these tests were passing on my local repository.
<issue_comment>username_0: It occurs when the storybook fails to load in the browser and thus the element is not found. It works well locally.
<issue_comment>username_0: @username_1 The issue here is that the size of the headings differs between browsers. My local browser for the test expects 25px, 20px, 16px for headings 1, 2, 3 respectively, whereas the CI/CD server expects 25px, 25px, 20px for headings 1, 2, 3 respectively. So if the tests are passing on the CI/CD server, they are failing on my local machine and vice versa. I guess we'll have to normalise the font-size CSS for headings. Would this be the right thing to do?
<issue_comment>username_1: I think we should be testing what is in [`ui-markdown-editor/src/utilities/constants`](https://github.com/accordproject/web-components/blob/master/packages/ui-markdown-editor/src/utilities/constants.js), which looks like is what the ci/cd server is expecting?
Maybe even import those values into the test so we know we're testing what is the source of truth for the CSS.
<issue_comment>username_0: @username_1 I'm really sorry for the late response.
I tried importing css values from [`ui-markdown-editor/src/utilities/constants`](https://github.com/accordproject/web-components/blob/master/packages/ui-markdown-editor/src/utilities/constants.js) and it's working. Thank you for the suggestion.
The problem was that with the merging of PR #302, the sizes of H1 and H2 are now the same, the difference being their alignment. And this commit wasn't present in my local repository, so the tests were failing.
Also, I think styling for paragraph should be added in [`ui-markdown-editor/src/utilities/constants`](https://github.com/accordproject/web-components/blob/master/packages/ui-markdown-editor/src/utilities/constants.js), because in [`ui-markdown-editor/src/utilities/constants`](https://github.com/accordproject/web-components/blob/master/packages/ui-markdown-editor/src/FormattingToolbar/StyleFormat/index.js) [style for paragraph](https://github.com/accordproject/web-components/blob/0667d024f07bc6b4293589047c4ad6851eb73bcf/packages/ui-markdown-editor/src/FormattingToolbar/StyleFormat/index.js#L39) is null.
<issue_comment>username
|
* @param skip Number of rows to skip. Leave as -1 for no skip
* @param limit Limit of results. Leave as -1 for no limit
* @param projection List of fields to fetch from the table. Leave as null to fetch them all.
* @param each Function to parse each row
*/
public static async findStreamSync(source: string, table: string, filter: GenericFilter, sortBy: string, sortDir: SortDirection, skip: number, limit: number, projection: Set<string>, each: (row: GenericRow) => void): Promise<void> {
return DataAccessObject.getDriver(source).findStreamSync(table, filter, sortBy, sortDir, skip, limit, projection, each);
}
/**
* Counts instances
* @param source Data source name
* @param table Table name
* @param filter Filter to apply
*/
public static async count(source: string, table: string, filter: GenericFilter): Promise<number> {
return DataAccessObject.getDriver(source).count(table, filter);
}
/**
* Aggregation
* @param source Data source name
* @param table Table name
* @param filter Filter to apply
* @param id Primary Key name
* @param field Field to aggregate
*/
public static async sum(source: string, table: string, filter: GenericFilter, id: string, field: string): Promise<number> {
return DataAccessObject.getDriver(source).sum(table, filter, id, field);
}
/**
* Insert instance
* @param source Data source name
* @param table Table name
* @param row Row to insert
*/
public static async insertInstance(source: string, table: string, row: GenericRow) {
return DataAccessObject.getDriver(source).insert(table, makeCopyOfObject(row), null);
}
/**
* Insert instances
* @param source Data source name
* @param table Table name
* @param rows Rows to insert
*/
public static async insertInstances(source: string, table: string, rows: GenericRow[]) {
return DataAccessObject.getDriver(source).batchInsert(table, makeCopyOfObject(rows));
}
/**
* Deletes rows
* @param source Data source name
* @param table Table name
* @param filter Filter to apply
*/
public static async deleteMany(source: string, table: string, filter: GenericFilter): Promise<number> {
return DataAccessObject.getDriver(source).deleteMany(table, filter);
}
/**
* Updates rows
* @param source Data source name
* @param table Table name
* @param filter Filter to apply
* @param updated Updated row
*/
public static async updateMany(source: string, table: string, filter: GenericFilter, updated: GenericRow): Promise<number> {
return DataAccessObject.getDriver(source).updateMany(table, filter, updated);
}
/**
* Atomic increment
* @param source Data source name
* @param table Table name
* @param keyName Key name
* @param keyValue Key value
* @param prop Field to increment
* @param inc Amount to increment
*/
public static async increment(source: string, table: string, keyName: string, keyValue: GenericKeyValue, prop: string, inc: number): Promise<void> {
return DataAccessObject.getDriver(source).increment(table, keyName, keyValue, prop, inc);
}
public source: string;
public table: string;
public pk: string;
public original: any;
public ref: any;
/**
* DAO constructor.
* @param source Data source
* @param table Table name
* @param pk Primary key name
* @param document Document
*/
constructor(source: string, table: string, pk: string, row: GenericRow) {
this.source = source;
this.table = table;
this.pk = pk;
this.original = makeCopyOfObject(row);
this.ref = row;
}
public changeRef(row: any) {
this.original = makeCopyOfObject(row);
this.ref = row;
}
/**
* Inserts the document.
*/
public async insert(): Promise<void> {
try {
await DataAccessObject.getDriver(this.source).insert(this.table, makeCopyOfObject(this.ref), this.pk, function (keyVal) {
this.ref[this.pk] = keyVal;
}.bind(this));
} catch (ex) {
return Promise.reject(ex);
}
this.original = makeCopyOfObject(this.ref); // Reset original
return Promise.resolve();
}
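As a minimal sketch of the snapshot pattern the constructor and `insert()` above rely on — note that `makeCopyOfObject` here is a hypothetical stand-in implemented as a JSON round-trip (an assumption; the real helper in this codebase may handle more types):

```typescript
// Hypothetical stand-in for makeCopyOfObject: a deep copy via JSON round-trip.
// The real helper may support Dates, nested class instances, etc.
function makeCopyOfObject<T>(obj: T): T {
    return JSON.parse(JSON.stringify(obj));
}

// The DAO keeps `original` (a frozen-in-time copy) and `ref` (the live row),
// so later updates can be computed by diffing the two.
const row = { id: 1, name: "alice" };
const original = makeCopyOfObject(row);
const ref = row;

ref.name = "bob"; // mutate the live reference

console.log(original.name); // still "alice": the snapshot is unaffected
console.log(ref.name);      // "bob"
```

This is why `insert()` resets `this.original` after a successful write: the snapshot must match the state that is now persisted.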
/**
*
|
9 17 0 1]
SCHED 17087ms: gomaxprocs=12 idleprocs=0 threads=40 spinningthreads=0 idlethreads=15 runqueue=167 [8 3 12 14 14 14 4 17 1 6 15 6]
SCHED 18092ms: gomaxprocs=12 idleprocs=0 threads=40 spinningthreads=0 idlethreads=22 runqueue=209 [18 4 0 17 4 9 0 10 0 3 5 16]
SCHED 19093ms: gomaxprocs=12 idleprocs=0 threads=40 spinningthreads=0 idlethreads=21 runqueue=198 [16 17 3 6 4 1 14 6 18 18 10 21]
SCHED 20093ms: gomaxprocs=12 idleprocs=0 threads=40 spinningthreads=0 idlethreads=24 runqueue=232 [7 12 13 5 3 7 5 2 8 1 9 8]
SCHED 21100ms: gomaxprocs=12 idleprocs=0 threads=40 spinningthreads=0 idlethreads=24 runqueue=206 [16 1 14 9 16 7 16 13 5 1 8 4]
SCHED 22110ms: gomaxprocs=12 idleprocs=0 threads=40 spinningthreads=0 idlethreads=23 runqueue=249 [8 6 8 3 10 9 12 8 6 11 0 3]
SCHED 23113ms: gomaxprocs=12 idleprocs=0 threads=40 spinningthreads=0 idlethreads=20 runqueue=223 [0 5 17 2 5 3 0 0 0 14 9 2]
SCHED 24116ms: gomaxprocs=12 idleprocs=0 threads=40 spinningthreads=1 idlethreads=22 runqueue=219 [1 0 9 3 1 2 13 10 1 1 18 12]
SCHED 25124ms: gomaxprocs=12 idleprocs=0 threads=40 spinningthreads=0 idlethreads=23 runqueue=199 [14 12 15 6 2 1 5 0 11 1 15 0]
SCHED 26126ms: gomaxprocs=12 idleprocs=0 threads=40 spinningthreads=0 idlethreads=21 runqueue=169 [1 12 6 11 8 5 8 5 5 10 6 4]
SCHED 27126ms: gomaxprocs=12 idleprocs=0 threads=40 spinningthreads=0 idlethreads=20 runqueue=232 [4 5 9 0 1 1 1 1 5 1 11 4]
SCHED 28129ms: gomaxprocs=12 idleprocs=0 threads=40 spinningthreads=0 idlethreads=21 runqueue=243 [9 11 7 3 7 14 3 12 2 12 12 1]
SCHED 29136ms: gomaxprocs=12 idleprocs=0 threads=40 spinningthreads=0 idlethreads=20 runqueue=58 [0 4 0 1 0 0 2 0 0 3 0 0]
SCHED 30144ms: gomaxprocs=12 idleprocs=0 threads=40 spinningthreads=0 idlethreads=22 runqueue=153 [0 9 7 1 4 2 10 2 5 0 0 1]
SCHED 31151ms: gomaxprocs=12 idleprocs=0 threads=40 spinningthreads=0 idlethreads=22 runqueue=216 [10 4 9 5 14 7 1
|
The feeling of receiving anal. I've only ever been in this situation once (I'm a guy).. and this is a weird story, but here we go I guess. The other people involved don't even remember it (or at least that's what they say, though I think they would've mentioned at least *something* if they had).
So I was a freshman in college and had been invited to my very first college party. I wasn't really sure what to expect because I didn't go to all that many parties in high school, but I dressed up a bit, drove to the house, and upon arrival I was evidently one of the first ones there.
I was sort of confused by this because I had been told that the party would start at 7:00, and it was now 7:45 and I was one of 6 there (4 girls, 2 guys). I, so far, seemed to also be the only one who had bothered to dress up even a little (button-down shirt and some nice pants), everyone else was just wearing casual clothes, more casual than what they'd wear to school even.
The only person I knew there was the host, and everyone was gathered around a table playing poker with a couple of beers so I got dealt in. I'm a fairly talkative guy, and was making good conversation with these new people even for being completely sober. I don't drink beer, but wanted to loosen up a bit with some booze, so I asked if there was anything else to drink (alcoholic) apart from beer, and the host said that someone else was bringing over some spirits soon.
We played cards and lightly flirted around with each other for about an hour, and in that time only 3 more people had shown up, even though the house was set up for a big event with lots of food and even a really nice set of speakers out by the pool that my friend had rented for the occasion.
Though the turnout was no more than 15 at peak, we didn't mind because we were having a good time as a small group hanging out in and by the pool. I ended up getting quite flirty with one girl I had met, and as she (along with everyone else) was getting progressively more drunk beer after beer. The hard liquor had arrived by now, but it was already getting pretty late to start drinking, so I cracked a beer open and forced myself to slowly sip on it as to not make her feel awkward that I was sober.
Eventually I managed to pull her away from the group and started chatting with her one-on-one. She was fairly intoxicated by this point, and was touching me all over. I decided it was time to escalate things, and we walked back inside. Almost immediately once we were out of eye-sight of the others we start making out. I'm not sure who initiated it, but it felt really good (she was a fantastic kisser surprisingly, especially considering that she was considerably drunk). I led her upstairs, though I'd never been to this house before and didn't know where to go.
I eventually found a bedroom with a king-size bed, and we both collapsed inside onto the bed, practically already stripping each other off. I quickly got up and shut the door, but there was no lock. I decided it didn't matter and went back to the girl on the bed. Fast forward about 20 minutes, we're having sex.
Then, after about 10 minutes, the host and another girl with short, black hair burst through the door, both in a rather drunken state making out with each other. I immediately panic and realize that it was *his* room, and that I was having sex in his bed. When he turned and saw me he didn't even look mad, and said (quite slurred) "oh cool, an orgy", and then the two laughed a lot. I looked back to my girl and said "let's get out of here", but she responded "no, this could be fun". I was way too sober to initiate a foursome, but then thought 'screw it, I'm in college now'. Within 5 minutes everyone is naked, and the other two seemed to skip foreplay and went straight to sex.
I'll spare you the details of the actual sex because this isn't /r/gonewildstories, but in short it was amazing... though there was a lot of fumbling at times due to three incredibly drunk people around me.
After finishing up the mind-blowing sex we all fell asleep there. Then, in the middle of the night, I'm woken up by some hands sliding down my back. I was pretty tired, but I guess I was up for some midnight head, so I didn't stop her. Then, the hands clasped around my waist, and pulled me closer, surprisingly strong for a woman.
Now know something: I'm sort of bi and have had slight feelings of 'sexual confusion' my whole
|
TAG219:
sll $0, $0, 0
slti $2, $3, 10
mflo $2
mthi $2
TAG220:
mfhi $4
slti $2, $4, 12
mflo $3
andi $1, $2, 7
TAG221:
sb $1, 0($1)
sb $1, 0($1)
xor $3, $1, $1
mtlo $3
TAG222:
mfhi $1
bne $3, $1, TAG223
mflo $1
lh $3, 0($1)
TAG223:
sw $3, -256($3)
sw $3, -256($3)
srl $3, $3, 6
lb $1, 0($3)
TAG224:
lui $3, 11
lui $3, 1
sb $1, 0($1)
bgez $1, TAG225
TAG225:
mfhi $2
lh $3, 0($2)
lbu $2, 0($2)
subu $2, $3, $3
TAG226:
sh $2, 0($2)
lw $1, 0($2)
slti $4, $2, 13
sb $2, 0($2)
TAG227:
addu $3, $4, $4
sltiu $3, $4, 10
mfhi $1
sb $3, 0($3)
TAG228:
mtlo $1
mtlo $1
sllv $1, $1, $1
lui $1, 15
TAG229:
divu $1, $1
mthi $1
addu $1, $1, $1
mfhi $4
TAG230:
mfhi $4
mfhi $3
beq $4, $3, TAG231
sll $0, $0, 0
TAG231:
lui $1, 12
sllv $4, $3, $3
mtlo $3
lui $3, 0
TAG232:
lb $4, 0($3)
mflo $4
multu $3, $4
xor $4, $4, $4
TAG233:
multu $4, $4
sw $4, 0($4)
sh $4, 0($4)
mthi $4
TAG234:
bne $4, $4, TAG235
lh $3, 0($4)
bgez $4, TAG235
sb $3, 0($4)
TAG235:
mtlo $3
sw $3, 0($3)
beq $3, $3, TAG236
mtlo $3
TAG236:
lui $1, 4
lui $4, 1
sw $4, 0($3)
beq $1, $4, TAG237
TAG237:
xor $1, $4, $4
div $4, $4
sll $0, $0, 0
lb $4, 0($1)
TAG238:
add $2, $4, $4
mult $2, $4
blez $2, TAG239
lui $1, 15
TAG239:
bgtz $1, TAG240
lui $2, 8
sh $1, 0($2)
or $1, $1, $2
TAG240:
bgez $1, TAG241
lui $3, 9
beq $1, $3, TAG241
sh $1, 0($1)
TAG241:
lui $4, 3
mflo $2
sll $0, $0, 0
bltz $2, TAG242
TAG242:
lui $4, 3
multu $2, $2
bltz $4, TAG243
sll $3, $4, 8
TAG243:
sll $0, $0, 0
sll $0, $0, 0
blez $2, TAG244
mthi $2
TAG244:
sltu $2,
|
+ outerInstance.phase);
outerInstance.shutdown(GO_TO_OFF);
return;
}
string lastUsedAddress = outerInstance.PushServerAddress;
string addressToUse = lastUsedAddress;
if (!string.ReferenceEquals(controlLink, null) && !outerInstance.ignoreServerAddressCache)
{
controlLink = RequestsHelper.completeControlLink(addressToUse, controlLink);
addressToUse = controlLink;
}
outerInstance.sessionServerAddress = addressToUse;
outerInstance.log.Debug("Address to use after create: " + outerInstance.sessionServerAddress);
if (!lastUsedAddress.Equals(outerInstance.sessionServerAddress))
{
if (outerInstance.@is(CREATING))
{
/*
* Close the WebSocket open because of wsEarlyOpen flag and
* open a new WebSocket using the given control-link.
*
* NB This operation affects only create_session requests.
* Bind_session requests ignore the control-link.
*/
if (outerInstance.log.IsDebugEnabled)
{
outerInstance.log.Debug("Control-Link has changed: " + lastUsedAddress + " -> " + outerInstance.sessionServerAddress);
}
outerInstance.changeControlLink(outerInstance.sessionServerAddress);
}
}
if (keepaliveIntervalDefault > 0)
{
if (outerInstance.isPolling)
{
//on polling sessions the longest inactivity permitted is sent instead of the keepalive setting
outerInstance.options.IdleTimeout = keepaliveIntervalDefault;
}
else
{
outerInstance.options.KeepaliveInterval = keepaliveIntervalDefault;
}
}
if (outerInstance.@is(CREATING))
{
//New session!
if (!string.ReferenceEquals(outerInstance.sessionId, null) && !( outerInstance.sessionId.Equals(newSession) ))
{
// nothing can be trusted here
outerInstance.log.Debug("Unexpected session " + outerInstance.sessionId + " found while initializing " + newSession);
outerInstance.reset();
}
outerInstance.SessionId = newSession;
}
else
{
if (!outerInstance.sessionId.Equals(newSession))
{
outerInstance.log.Error("Bound unexpected session: " + newSession + " (was waiting for " + outerInstance.sessionId + ")");
outerInstance.shutdown(GO_TO_OFF);
return;
}
/* calculate reconnect timeout, i.e. the actual time we spent to send the request and receive the response (the roundtrip) */
long spentTime = ( DateTime.Now ).Ticks - outerInstance.sentTime;
//we add to our connectTimeout the spent roundtrip and we'll use that time as next connectCheckTimeout
//ok, we wanna give enough time to the client to connect if necessary, but we should not exaggerate :)
//[obviously spentTime can't be > this.policyBean.connectTimeout after the first connection,
//but it may grow connection after connection if we give him too much time]
long ct = outerInstance.options.CurrentConnectTimeout;
outerInstance.reconnectTimeout = ( spentTime > ct ? ct : spentTime ) + ct;
}
outerInstance.slowing.startSync(outerInstance.isPolling, outerInstance.isForced, ( DateTime.Now ).Ticks);
onEvent();
if (outerInstance.@is(CREATED))
{
if (outerInstance.recoveryBean.Recovery)
{
/*
* branch reserved for recovery responses
* (i.e. bind_session requests with LS_recovery_from parameter)
*/
outerInstance.recoveryBean.restoreTimeLeft();
}
else
{
/*
* branch reserved for create_session responses
*/
outerInstance.handler.onSessionStart();
outerInstance.subscriptions.onSessionStart();
outerInstance.messages.onSessionStart();
outerInstance.details.SessionId = newSession;
outerInstance.details.ServerInstanceAddress = outerInstance.sessionServerAddress;
if (outerInstance.cachedRequiredBW)
{
outerInstance.sendConstrain(0, null);
outerInstance.cachedRequiredBW = false;
}
}
}
else
{
/*
* branch reserved for bind_session responses (recovery responses excluded)
*/
outerInstance.handler.onSessionBound();
outerInstance.options.resetConnectTimeout();
outerInstance.protocol.DefaultSessionId = newSession;
}
}
public virtual void onLoopReceived(long serverSentPause)
{
if (outerInstance.@is(RECEIVING) || outerInstance.@is(STALLING) || outerInstance.@is(STALLED) || outerInstance.@is(CREATED))
{
if (outerInstance.switchRequired)
{
outerInstance.handler.switchReady(outerInstance.handlerPhase, outerInstance.switchCause, outerInstance.switchForced, false);
}
else if (outerInstance.slowRequired)
{
|
```
(and the same for JS case)
Then this would maintain backward compatibility and allow for async configs.
<issue_comment>username_2: Any update on this?
<issue_comment>username_3: For anyone else in dire need of this feature, I'm sharing a hack that we've been using as a workaround successfully for some time (thanks @username_0 for suggestions where it should be patched).
`ormconfig.ts`:
```typescript
import { ConnectionOptionsReader } from 'typeorm/connection/ConnectionOptionsReader';
function patchAsyncConnectionSetup() {
const prototype = ConnectionOptionsReader.prototype as any;
const original = prototype.normalizeConnectionOptions;
prototype.normalizeConnectionOptions = function(
options: Promise<any> | object,
) {
if ('then' in options) {
return options.then((arg) => original.call(this, arg));
}
return original.call(this, options);
};
}
patchAsyncConnectionSetup();
async function buildConnectionOptions() { ... }
const config = buildConnectionOptions();
export = config;
```
Tested with `typeorm 2.0.18`
<issue_comment>username_4: its also useful when i'm using kms to encrypt secrets and need to decrypt them asynchronously
<issue_comment>username_5: Thanks @username_3 for the hack. I get an error
```
import { ConnectionOptionsReader } from 'typeorm/connection/ConnectionOptionsReader';
^
SyntaxError: Unexpected token {
at Module._compile (internal/modules/cjs/loader.js:723:23)
at Object.Module._extensions..js (internal/modules/cjs/loader.js:789:10)
at Module.load (internal/modules/cjs/loader.js:653:32)
at tryModuleLoad (internal/modules/cjs/loader.js:593:12)
at Function.Module._load (internal/modules/cjs/loader.js:585:3)
at Module.require (internal/modules/cjs/loader.js:692:17)
at require (internal/modules/cjs/helpers.js:25:18)
at Function.PlatformTools.load (/Users/USER/PATH/node_modules/typeorm/platform/PlatformTools.js:108:28)
at ConnectionOptionsReader.<anonymous> (/Users/USER/PATH/node_modules/typeorm/connection/ConnectionOptionsReader.js:121:75)
at step (/Users/USER/PATH/node_modules/tslib/tslib.js:136:27)
```
Here is my `ormconfig.ts`:
```
import { ConnectionOptionsReader } from 'typeorm/connection/ConnectionOptionsReader';
import {
SQL_PORT,
SQL_LOGGING,
SQL_HOST,
SQL_USERNAME,
SQL_PASSWORD,
SQL_DATABASE,
} from 'datasources/sql/constants/config.json';
import EnvKeys from 'constants/env-keys';
import Environments from 'constants/environments';
import { getEnvs } from '@packages/infrastructure';
// This is a hacky patch
// async functions for the ormconfig
/* eslint-disable */
function patchAsyncConnectionSetup() {
const { prototype } = ConnectionOptionsReader;
// @ts-ignore
const original = prototype.normalizeConnectionOptions;
// @ts-ignore
prototype.normalizeConnectionOptions = function(options) {
if ('then' in options) {
// @ts-ignore
return options.then(arg => original.call(this, arg));
}
return original.call(this, options);
};
}
/* eslint-enable */
// @ts
patchAsyncConnectionSetup();
async function buildConnectionOptions() {
const configs = [
EnvKeys.ENV,
SQL_PORT,
SQL_LOGGING,
SQL_HOST,
SQL_USERNAME,
SQL_PASSWORD,
SQL_DATABASE,
];
const [
env,
port,
logging,
host,
username,
password,
database,
] = await Promise.all(configs.map(getEnvs));
const isLocal = env === Environments.LOCAL;
[Truncated]
...config,
},
{
...config,
name: 'seed',
migrationsTableName: 'seeds',
migrations: `src/datasources/sql/seeds/**/*.${extension}`,
cli: {
migrationsDir: `src/datasources/sql/seeds/**/*.${extension}`,
},
},
];
}
const config = buildConnectionOptions();
module.exports = config;
```
When using a simple `ormconfig.json` with the same arguments, it works but not with the `ormconfig.ts`. Any idea what I'm missing?<issue_closed>
<issue_comment>username_7: I am trying to connect with mongoDB atlas,
Thanks to @username_3 I was able to make my ormconfig async and the config values are loaded successfully, but even then, when I try to save data in MongoDB Atlas, it throws an error
|
import { vec2 } from 'gl-matrix';
import * as path from 'path'
import Camera from './Camera';
import Renderable from '../Renderer/Object/Renderables/Renderable';
import SpriteAnimateRenderable from '../Renderer/Object/Renderables/SpriteAnimateRenderable';
export default class SceneFileParser {
/**
* WebGL Instance
*/
private _webGL: WebGLRenderingContext;
/**
* ResourceMap Instance
*/
private _resourceMap: any;
/**
* Scene XML file data
*/
private sceneXmlDocument: XMLDocument;
/**
* Loaded scripts present in the scene
*/
private loadedScripts: Array<number|Object> = [];
/**
* Loaded audio present in the scene
*/
private loadedAudio: Array<string> = [];
/**
* Cameras present in the scene
*/
private loadedCameras: Array<number|Object> = [];
/**
* Game objects present in the scene
*/
private loadedGameObjects: any = [];
/**
* Default shader object
*/
private _shader: any;
/**
* Texture shader object
*/
private _textureShader: any;
/**
* Sprite shader object
*/
private _spriteShader: any;
/**
* Audio manager instance
*/
private audioManager: any;
/**
* Constructor
* @param webGL
* @param resourceMap
*/
constructor(webGL: WebGLRenderingContext, resourceMap: any) {
this._webGL = webGL;
this._resourceMap = resourceMap;
}
/**
* Returns the element present in the scene XML
* @param sceneElement
*/
private getSceneElement (sceneElement: string) {
let element = this.sceneXmlDocument.getElementsByTagName(sceneElement);
if (element.length === 0) {
console.error("Warning: Level element [" + sceneElement + "] was not found!");
}
return element;
};
/**
* Returns the scripts present in the scene
*/
public getLoadedScripts() : Array<number|Object> {
return this.loadedScripts;
}
/**
* Returns the audio present in the scene
*/
public getLoadedAudios() : Array<string> {
return this.loadedAudio;
}
/**
* Returns the cameras present in the scene
*/
public getLoadedCameras() : Array<number|Object> {
return this.loadedCameras;
}
/**
* Returns the game objects present in the scene
*/
public getLoadedGameObjects() : any {
return this.loadedGameObjects;
}
/**
* Sets the default shader for creating game objects
* @param shader
* @returns self
*/
public setDefaultShader(shader: any) : this {
this._shader = shader;
return this;
}
/**
* Sets the texture shader for creating game objects
* @param shader
* @returns self
*/
public setTextureShader(shader: any) : this {
this._textureShader = shader;
return this;
}
/**
* Sets the sprite shader for creating game objects
* @param shader
* @returns self
*/
public setSpriteShader(shader: any) : this {
this._spriteShader = shader;
return this;
}
/**
* Sets the audio manager
* @param audioManager
*/
public setAudioManager(audioManager: any) : this {
this.audioManager = audioManager;
return this;
}
/**
* Parses the scene XML file data
* @param sceneName
*/
public parse(sceneName: string) : void {
this.sceneXmlDocument = this._resourceMap.retrieveAsset(sceneName);
this.parseScripts();
this.parseAudio();
this.parseCamera();
this.parseSquares();
}
/**
* Parses the script data in the scene XML file
*/
public parseScripts() {
let scripts = this.getSceneElement("script");
if (scripts.length < 1) {
return;
}
for (let index = 0; index < scripts.length; index++) {
let src = scripts[index].getAttribute("src");
let script = require(path.join(__dirname, '../../', src));
this.loadedScripts.push(script);
}
}
/**
* Parses the audio data in the scene XML file
*/
public parseAudio() {
let audio = this.getSceneElement("Audio");
if (audio.length < 1) {
return;
}
for (let index =
|
To Jolly Roger she was all that the world
held of glorious womanhood.
"But--you can't!" he cried desperately. "I've come to tell you things,
Nada. I'm not fit. I'm not what you think I am. I've been livin' a
lie--"
He hesitated, and then lashed himself on to the truth.
"You'll hate me when I tell you, Nada. You think Jed Hawkins is bad.
But the law thinks I'm worse. The police want me. They've wanted me
for years. That's why I came down here, and hid over in Indian Tom's
cabin--near where I first met you. I thought they wouldn't find me away
down here, but they did. That's why Peter and I moved over to the big
rock-pile at the end of the Ridge. I'm--an outlaw. I've done a lot of
bad things--in the eyes of the law, and I'll probably die with a bullet
in me, or in jail. I'm sorry, but that don't help. I'd give my life
to be able to tell you what's in my heart. But I can't. It wouldn't be
square."
He wondered why no change came into the steady blue of her eyes as he
went on with the truth. The pallor was gone from her cheeks. Her lips
seemed redder, and what he was saying did not seem to startle her, or
frighten her.
"Don't you understand, Nada?" he cried. "I'm bad. The police want me.
I'm a fugitive--always running away, always hiding--an outlaw--"
She nodded.
"I know it, Mister Roger," she said quietly. "I heard you tell Peter
that a long time ago. And Mister Cassidy was at our place the day after
you and Peter ran away from Indian Tom's cabin, and I showed him the
way to Father John's, and he told me a lot about you, and he told Father
John a lot more, and it made me awful proud of you, Mister Roger--and I
want to go with you and Peter!"
"Proud!" gasped Jolly Roger. "Proud, of ME--"
She nodded again.
"Mister Cassidy--the policeman--he used just the word you used a minute
ago. He said you was square, even when you robbed other people. He said
he had to get you in jail if he could, but he hoped he never would. He
said he'd like to have a man like you for a brother. And Peter loves
you. And I--"
The color came into her white face.
"I'm goin' with you and Peter," she finished.
Something came to relieve the tenseness of the moment for Jolly Roger.
Peter, nosing in a thick patch of bunch-grass, put out a huge snowshoe
rabbit, and the two crashed in a startling avalanche through the young
jackpines, Peter's still puppyish voice yelling in a high staccato as he
pursued. Jolly Roger turned from Nada, and stared where they had gone.
But he was seeing nothing. He knew the hour of his mightiest fight had
come. In the reckless years of his adventuring he had more than once
faced death. He had starved. He had frozen. He had run the deadliest
gantlets of the elements, of beast, and of man. Yet was the strife in
him now the greatest of all his life. His heart thumped. His brain was
swirling in a vague and chaotic struggle for the mastery of things, and
as he fought with himself--his unseeing eyes fixed on the spot where
Peter and the snowshoe rabbit had disappeared--he heard Nada's voice
behind him, saying again that she was going with him and Peter. In those
seconds he felt himself giving way, and the determined action he
had built up for himself began to crumble like sand. He had made his
confession and in spite of it this young girl he worshipped--sweeter and
purer than the flowers of the forest--was urging herself upon him! And
his soul cried out for him to turn about, and open his arms to her, and
gather her into them for as long as God saw fit to give him freedom and
life.
But still he fought against that mighty urge, dragging reason and right
back fragment by fragment, while Nada stood behind him, her wide-open,
childishly beautiful eyes beginning to comprehend the struggle that was
disrupting the heart of this man who was an outlaw--and her god among
men. And when Jolly Roger turned, his face had aged to the grayness of
stone, and his eyes were dull, and there was a terribly
|
, so you should not use violin to play off-scale sounds \emph{in that case}.
Multilevel models also limit the actual use of parameters. Imagine you heard a recording of a symphonic orchestra. Lots of violins but you figured out that most of them actually play the same melody. So you can get away with using one violin score (sample group average) and assume that most violins play like that (most participants are very close to group average). Any deviations from that group melody are probably mistakes by individual musicians, not the actual melody. Same goes if you hear a choir. Again, many people sing (lots of parameters!) but, mostly, in unison, so you do not need to create an individual score sheet (parameter) for each singer, just one per group of singers.
Wrapping up the metaphor: the more flexible your instrument is, the more variable your performance can be, and the easier it is for you to mimic noise and imperfections of the recording that have nothing to do with the piece itself. But when you play it next time, matching the recording with all its noise and distortions perfectly, people who know the piece will be puzzled or may not even recognize it (poor out-of-sample predictions). Adapting the melody to a more limited instrument may make it easier for others to recognize the song! Thus, higher variance in performance accuracy (higher variance of deviance) indicates that you can overfit easily with that instrument (model) and you should be extra careful (impose a higher penalty for complexity).
\hypertarget{deviance-information-criterion-dic-and-widely-applicable-information-criterion-waic}{%
\section{Deviance information criterion (DIC) and widely-applicable information criterion (WAIC)}\label{deviance-information-criterion-dic-and-widely-applicable-information-criterion-waic}}
The two are very similar, as both compute the model complexity based on posterior distribution of log likelihood. The key difference is that DIC sums the log likelihood for each model (sample) first and then computes the variance over samples. WAIC computes variance of log likelihood per point and then sums those variances up. In the musical instrument metaphor, for DIC you perform the piece many times (generate many posterior samples), compute accuracy for each performance (deviance for a single sample), and then compute how variable they are. For WAIC, you go note by note (observation by observation). For each note you compute variance over all samples to see how consistent you are in playing it. Then, you sum this up.
\[DIC = -2 \cdot \left( log(p(y|\Theta)) - var(\sum log(p(y|\Theta_i))) \right)\]
\[WAIC = -2 \cdot \left( log(p(y|\Theta)) - \sum_i var(log(p(y_i|\Theta))) \right)\]
The penalty replaces \(k\) in AIC and, therefore, will go into the exponent inside the ratio. Again, same idea, that increase in variance of deviance (either per sample in DIC or per point in WAIC) leads to exponentially increasing estimate of complexity.
WAIC is more stable mathematically and is more widely applicable (that's what statisticians tell us). Moreover, its advantage is that it explicitly recognizes that not all data points in your sample are equal. Some (outliers) are much harder to predict than others. And it is the variance of log likelihood for these points that determines how much your model can overfit. An inflexible model will do a poor but consistent job (triangles don't care about pitch!), whereas a complex model can do anything from spot-on to terrible (violins can do anything). In short, you should use WAIC yourself but recognize DIC when you see it and think of it as a somewhat less reliable WAIC, which is still better than AIC or BIC when you use regularizing priors and/or hierarchical models.
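To make the difference between the two penalties concrete, here is a small numeric sketch (not from the chapter; the log-likelihood matrix is made up) that computes both from per-sample, per-observation log likelihoods:

```typescript
// logLik[s][i] = log p(y_i | Theta_s) for posterior sample s and observation i.
const logLik: number[][] = [
    [-1.0, -2.0, -1.5],
    [-1.2, -1.8, -1.6],
    [-0.9, -2.2, -1.4],
];

// Population variance of a list of numbers.
const variance = (xs: number[]): number => {
    const m = xs.reduce((a, b) => a + b, 0) / xs.length;
    return xs.reduce((a, x) => a + (x - m) ** 2, 0) / xs.length;
};

// DIC penalty: sum the log likelihood within each sample first,
// then take the variance across samples.
const perSampleTotals = logLik.map(row => row.reduce((a, b) => a + b, 0));
const dicPenalty = variance(perSampleTotals);

// WAIC penalty: take the variance across samples for each observation,
// then sum those per-observation variances.
const nObs = logLik[0].length;
const waicPenalty = Array.from({ length: nObs }, (_, i) =>
    variance(logLik.map(row => row[i]))
).reduce((a, b) => a + b, 0);
```

In this toy matrix the WAIC penalty comes out larger than the DIC penalty, because summing within a sample first lets per-observation disagreements partially cancel, while WAIC keeps each hard-to-predict point's variance intact.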
\hypertarget{importance-sampling}{%
\section{Importance sampling}\label{importance-sampling}}
Importance sampling is mentioned in the chapter but never explained, so here is a brief description. The core idea is to pretend that you sample from the distribution you need (but have no access to, or that is very inefficient to sample from directly) by sampling from another distribution (one you do have access to and can sample efficiently) and ``translating'' the probabilities via \emph{importance ratios}. What does this mean?
Imagine that you want to know an average total score for a given die after you throw it ten times. The procedure is as simple as it gets: you toss the die ten times, record the number you get on each throw, sum them up at the end. Repeat the same toss-ten-times-and-sum-it-up as many times as you want and compute your average. But what if you do not have access to that die because it is \emph{the die} and is kept under lock in
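In code, the ``translation'' can be sketched as follows. This is a minimal sketch with a made-up pair of distributions: we want the mean of a Normal(3, 1) target that we pretend we cannot sample from, so we draw from a Normal(0, 3) proposal instead and reweight each draw by the importance ratio \(p(x)/q(x)\).

```python
import numpy as np

rng = np.random.default_rng(1)

def normal_pdf(x, mu, sd):
    """Density of Normal(mu, sd) evaluated at x."""
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

# Target p we pretend we cannot sample from: Normal(3, 1).
# Proposal q we can sample from easily: Normal(0, 3).
x = rng.normal(0.0, 3.0, size=100_000)                 # draws from q
w = normal_pdf(x, 3.0, 1.0) / normal_pdf(x, 0.0, 3.0)  # importance ratios p/q

# Self-normalized importance-sampling estimate of E_p[X] (true value is 3).
estimate = np.sum(w * x) / np.sum(w)
```

The self-normalized form (dividing by the sum of the weights rather than the sample size) is used because it works even when the target density is known only up to a constant, which is the usual situation with posteriors.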
<issue_start><issue_comment>Title: Cannot get libva working with vainfo on Debian-8 : "va_openDriver() returns 18" after "va_getDriverName() returns -1"
username_0: Hi,
I am trying to use the QuickSync driver MSDK on a debian-8, and so I need to build libva and some other things.
I am installing vainfo from the intel git repository to check if libva works, and this is what gives me errors.
Here is my configuration:
I install the dependencies:
```
coder490a:~$ apt install -y cmake openbox git python3 rsync pkg-config automake curl build-essential autoconf libtool libdrm-dev libX11-dev libgl1-mesa-dev libpciaccess-dev libx11-dev libgl1-mesa-glx xutils-dev
```
Then I download the sources of libva and libva-utils:
```
git clone https://github.com/intel/libva.git --branch 2.5.0
git clone https://github.com/intel/libva-utils.git --branch 2.5.0
```
And I build them:
```
( mkcd libva && ./autogen.sh && make -j4 && make install )
export PKG_CONFIG_PATH=${PKG_CONFIG_PATH}:/usr/local/lib/pkgconfig
( mkcd libva-utils && ./autogen.sh && make -j4 && make install )
```
And finally I set the environment variables:
```
export LIBVA_DRIVERS_PATH=/usr/local/lib/dri/
export LIBVA_DRIVER_NAME=iHD
```
Now, when I run `vainfo`, here is what happens:
```
coder490a:~$ vainfo
error: can't connect to X server!
libva info: VA-API version 1.5.0
libva info: va_getDriverName() returns -1
libva info: User requested driver 'iHD'
libva info: Trying to open /usr/local/lib/dri/iHD_drv_video.so
libva info: Found init function __vaDriverInit_1_5
DRM_IOCTL_I915_GEM_APERTURE failed: Invalid argument
Assuming 131072kB available aperture size.
May lead to reduced performance or incorrect rendering.
get chip id failed: -1 [22]
param: 4, val: 0
libva error: /usr/local/lib/dri/iHD_drv_video.so init failed
libva info: va_openDriver() returns 18
vaInitialize failed with error code 18 (invalid parameter),exit
```
```
coder490a:~$ ldd `which vainfo`
linux-vdso.so.1 (0x00007ffdeb55a000)
libva-x11.so.2 => /usr/local/lib/libva-x11.so.2 (0x00007f428b28b000)
libX11.so.6 => /usr/lib/x86_64-linux-gnu/libX11.so.6 (0x00007f428af48000)
libXext.so.6 => /usr/lib/x86_64-linux-gnu/libXext.so.6 (0x00007f428ad36000)
libXfixes.so.3 => /usr/lib/x86_64-linux-gnu/libXfixes.so.3 (0x00007f428ab30000)
libva-drm.so.2 => /usr/local/lib/libva-drm.so.2 (0x00007f428a92e000)
libdrm.so.2 => /usr/lib/x86_64-linux-gnu/libdrm.so.2 (0x00007f428a721000)
libva.so.2 => /usr/local/lib/libva.so.2 (0x00007f428a4fb000)
libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007f428a4f6000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f428a336000)
libxcb.so.1 => /usr/lib/x86_64-linux-gnu/libxcb.so.1 (0x00007f428a114000)
/lib64/ld
No need at all to feel like an asshole, I have no problem with questions or criticism as long as it’s given reasonably (which you’ve done) so don’t feel bad about that!
First of all, when I talk about how I want to keep things consistent, I want to keep the things consistent that make sense to me, not to keep everything consistent for the sole sake of consistency. Things I typically wanted consistent are the total image resolution, the basic size and location of game art display area, the placement of the major publisher/developer logo beside the SNES logo, the SNES logo, the yellow dots on the right hand side and the grey, 4 oval squares in the bottom right hand corner. Not having all of these consistent would irritate me greatly. That being said, a *LOT* of the things you mentioned are valid, some of them I’ve done on purpose by choice, others are unfortunately unintended due to either my lack of image editing ability, lack of quality source material or me being forced to choose between best of two evils. There are also a few things you’ve mentioned that I intend to fix.
So in regards to publisher/developer, I basically created a separate logo template for any publisher/developer that had 3 or more titles on my list. Any fewer than that and I just used the official Nintendo seal; it just wasn’t worth my time to make templates or add logos for one or two specific covers. Second of all, I typically opted to use the more popular logo between non-Nintendo publishers/developers (which is why I used ENIX for Actraiser instead of Quintet). If it was between Nintendo and non-Nintendo, I typically went with the non-Nintendo one if they had 3 or more titles in my list (which is why I went with Rare for DKC). Sooo.. there’s a method to my madness in this regard, but it does create notable inconsistencies, which I am personally happy with and prefer for my SNES Mini :)
Next on your list are bad crops and inconsistent overlaps. All the bad crops (Biker Mice and Ken Griffey, for example) are caused by source art that’s already cropped. I’d never purposely crop something that I have overlapping outside the main game art area. The Biker Mice one is small, so it doesn’t really bug me that much; the Earthbound and Ken Griffey ones are atrocious, but unless I remove the overlap altogether (which I just might end up doing) or find better source art, I have no choice. That brings me to the overlap now. In general I like overlapping pieces on top of logos, so where they make sense they will, but I also want the logos to be readable and not look bad when overlapped, which is why some (like the Tetris Attack overlap) actually go under the logos. The DKC art is kind of a beast; the entire box art is foresty, which is why it extends out so far, but I wasn’t able to make proper art that covered the entire thing (which is a limit of my source art and image editing ability). That being said, the more I look at some of the bad ones (like Earthbound and Ken Griffey), the more I’m inclined to remove the overlapping parts altogether, because they keep looking worse every time I look at them. The Dragon Quest 1&2 one is horrible and doesn’t follow this logic, so I’ll either shrink or eliminate the overlap and probably put it behind the logos.
The next thing you mention is listing developers/publishers multiple times. Almost all of these (the two you listed) are a result of the source art. I didn’t even realize it with the FF5 art; it’s pure white, so I can (and will) easily remove that one. The others, however, are images where I just couldn’t cleanly remove the developer/publisher from the art without removing large chunks of it (Hit the Ice, for example: I’d rather have two Taito logos than lose the overlapping stick that I like a lot). I’d also really like to remove any bubble text or ads on images, but sadly I couldn’t find any relevant source art for some of the boxes without said ads, so.. some of them suffer :)
As for artifact pixels, that’s low-quality source art and me being bad at image editing! Looking at the Harvest Moon image, I can *probably* clean that one up; which other ones did you notice? I might be able to fix a couple of them. I had to scrap some source art in favor of different pieces because I was just not able to remove all the artifact pixels *and* keep the actual art in a decent-looking state (my original Illusion of Gaia was really bad and I needed to find new source art for it). Same thing with aspect ratios, where I had to choose between squishing some of the covers or losing parts of them to keep the correct aspect ratio. I generally tried to keep the squishing/stretching to a minimum, but between trying to remove other random stuff I don’t like
I’ve been having mood swings these past few days and this is getting ridiculous. Like, I am so happy, then after a minute or two I would be sad, or angry, or pissed, then laughing or crying. I can’t even understand myself anymore. I have a volatile nature, but I’ve never had this kind of sudden change of moods in a very long time. No wonder there are few people who can put up with me and dang! I highly appreciate them.
I just had two cups of Coffee Bean brand 3-in-1 coffee in the span of three hours while figuring out how to deal with these feelings in a positive and productive way. I attempted to log in to my part-time job to divert my attention, but to no avail. I also tried listening to music and reading a book, but I was still unable to brush aside such mixed emotions.
I decided to write. Yes, write. Writing is one of my outlets. It is more than self-expression; it is actually a way of detoxifying my mind, getting rid of the burdens in my heart, and thus helping me balance my moods.
This post is in response to a monthly blogging event called #mymonthlymemories hosted by a gorgeous blogger, Basant She. You may want to check out her blog entitled – The Socially Anxious Extrovert and if you want to join us, please click here. I find reading people’s memorable experiences refreshing and invigorating.
Talk about what happened last month. Oh, I could describe March as a busier, nastier, and wackier month this year, so far. Now, I am smiling from ear to ear. My heart is full, my mind is a bit twisted. But my soul…is adrift in a stream of doubt and fear. I am happy but I am also sad and a bit anxious. I can’t help feeling this way.
My other work-from-home job, which is, I am afraid, my main source of income, is drifting away. We’ve been forced to a halt for one week now and, as of this writing, I am not even sure if there is still work to go back to. Our client is so silent, it’s deafening. I am perceptive, but I also hate to assume. I asked him indirectly, but he just read my private message and hasn’t responded until now. I can’t help but think that maybe this is his way of telling us to find an alternative job or, worse, to prepare us for the inevitable. Still, I am keeping the faith. God makes things work for those who put their trust in Him and who acknowledge His mercy and loving-kindness.
Blessings? I lost count, actually. Our youngest turned 5 on March 14th. No fancy celebration, no crowd, no show-offs. Believe me, my children prefer this kind of thing. I just bought a cake, a box of their favorite pizza, and 1.5 liters of Royal True Orange. But of course, a present for the birthday celebrant: a Nerf Gun with small cars as ammunition. Looking back on how delighted my youngest was upon getting his birthday present still puts a smile on my face and brings warmth to my heart. Truly, God has a way of teaching us to be humble, that even small things can still bring so much happiness.
On the 27th of March, our second princess graduated from primary level. Thank God, despite our ill feelings for each other, the father of my children still managed to take part in this memorable event. He called via Skype to greet his daughter and she couldn’t be happier. She’s a Daddy’s girl anyway.
Whoa! This must be the longest post that I’ve had this year. Allow me to add the happiness I have in being able to finish my DIY Hanging Bookshelves before saying hello to April.
This is all about March. Cheers to another promising month this April, God willing. I am a little calmer now. Thank God for the opportunity to partake in this writing exercise. May we all find and do things that make us happy to be alive.
It’s past February! Whew, I didn’t even notice. Well, not until my monthly bills came rushing at my door, lol.
Time to deal with my #mymonthlymemories post. Thank God! Last month turned out to be better than January. I still have a few things to share here when it comes to blogging. I only managed to join a weekly blogging event once: the I Write Her Weekly Haiku Challenge hosted by Susi. You may want to check out my entry here.
If you want to join us, please click here. This is always fun.
I consider myself a sentimental fool but there are few instances when I hate to look back. There are hurts and pains so unbearable that I wanted to just forget and move on. I could tell people that I finally accepted my fate and I’m moving on. I’m sure most of