module Steps
  module Applicant
    class HasSolicitorController < Steps::ApplicantStepController
      def edit
        @form_object = HasSolicitorForm.new(
          c100_application: current_c100_application,
          has_solicitor: current_c100_application.has_solicitor
        )
      end

      def update
        update_and_advance(HasSolicitorForm)
      end
    end
  end
end
|
# syslog-client
This module is a pure JavaScript implementation of the [BSD Syslog Protocol][1].
This module is installed using [node package manager (npm)][2]:

    npm install syslog-client
It is loaded using the `require()` function:

    var syslog = require("syslog-client");
TCP or UDP clients can then be created to log messages to remote hosts.

    var client = syslog.createClient("127.0.0.1");
    client.log("example message");
[1]: https://www.ietf.org/rfc/rfc3164.txt
[2]: https://npmjs.org
# Constants
The following sections describe constants exported and used by this module.
## syslog.Transport
This object contains constants for all valid values for the `transport`
attribute passed to the `options` argument for the `createClient()` function.
The following constants are defined in this object:
* `Tcp`
* `Udp`
## syslog.Facility
This object contains constants for all valid values for the `facility`
attribute passed to the `options` argument for the `log()` method on the
`Client` class. The following constants are defined in this object:
* `Kernel` - 0
* `User` - 1
* `System` - 3
* `Audit` - 13
* `Alert` - 14
* `Local0` - 16
* `Local1` - 17
* `Local2` - 18
* `Local3` - 19
* `Local4` - 20
* `Local5` - 21
* `Local6` - 22
* `Local7` - 23
## syslog.Severity
This object contains constants for all valid values for the `severity`
attribute passed to the `options` argument for the `log()` method on the
`Client` class. The following constants are defined in this object:
* `Emergency` - 0
* `Alert` - 1
* `Critical` - 2
* `Error` - 3
* `Warning` - 4
* `Notice` - 5
* `Informational` - 6
* `Debug` - 7
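For reference, the facility and severity codes combine into the `PRI` value that RFC 3164 places at the start of each message: `PRI = facility * 8 + severity`. A quick standalone illustration (plain JavaScript, not part of this module's API):

```javascript
// The facility and severity codes above combine into the RFC 3164 PRI value
// that prefixes each syslog message, e.g. "<134>" for Local0/Informational.
function calculatePri(facility, severity) {
    return facility * 8 + severity;
}

// Local0 (16) + Informational (6)
console.log(calculatePri(16, 6)); // → 134
```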
# Using This Module
All messages are sent using an instance of the `Client` class. This
module exports the `createClient()` function which is used to create
instances of the `Client` class.
## syslog.createClient([target], [options])
The `createClient()` function instantiates and returns an instance of the
`Client` class:
    // Default options
    var options = {
        syslogHostname: os.hostname(),
        transport: syslog.Transport.Udp,
        port: 514
    };

    var client = syslog.createClient("127.0.0.1", options);
The optional `target` parameter defaults to `127.0.0.1`. The optional
`options` parameter is an object, and can contain the following items:
* `port` - TCP or UDP port to send messages to, defaults to `514`
* `syslogHostname` - Value to place into the `HOSTNAME` part of the `HEADER`
part of each message sent, defaults to `os.hostname()`
* `tcpTimeout` - Number of milliseconds to wait for a connection attempt to
the specified Syslog target, and the number of milliseconds to wait for
TCP acknowledgements when sending messages using the TCP transport,
defaults to `10000` (i.e. 10 seconds)
* `transport` - Specify the transport to use, can be either
`syslog.Transport.Udp` or `syslog.Transport.Tcp`, defaults to
`syslog.Transport.Udp`
## client.on("close", callback)
The `close` event is emitted by the client when the client's underlying TCP or
UDP socket is closed.
No arguments are passed to the callback.
The following example prints a message to the console when a client's
underlying TCP or UDP socket is closed:

    client.on("close", function () {
        console.log("socket closed");
    });
## client.on("error", callback)
The `error` event is emitted by the client when the client's underlying TCP or
UDP socket emits an error.
The following arguments will be passed to the `callback` function:
* `error` - An instance of the `Error` class; the exposed `message` attribute
will contain a detailed error message.
The following example prints a message to the console when an error occurs
with a client's underlying TCP or UDP socket:

    client.on("error", function (error) {
        console.error(error);
    });
## client.close()
The `close()` method closes the client's underlying TCP or UDP socket. This
will result in the `close` event being emitted by the client's underlying TCP
or UDP socket, which is passed through to the client, resulting in the client
also emitting a `close` event.
The following example closes a client's underlying TCP or UDP socket:

    client.close();
## client.log(message, [options], [callback])
The `log()` method sends a Syslog message to a remote host.
The `message` parameter is a string containing the message to be logged.
The optional `options` parameter is an object, and can contain the following
items:
* `facility` - Either one of the constants defined in the `syslog.Facility`
object or the facility number to use for the message, defaults to
`syslog.Facility.Local0`
* `severity` - Either one of the constants defined in the `syslog.Severity`
object or the severity number to use for the message, defaults to
`syslog.Severity.Informational`
The `callback` function is called once the message has been sent to the remote
host, or when an error occurs. The following arguments will be passed to the
`callback` function:
* `error` - Instance of the `Error` class or a sub-class, or `null` if no
error occurred
Each message sent to the remote host will have a newline character appended to
it if one is not already present. Take care to ensure newline characters are
not embedded within the message passed to this method (i.e. appearing anywhere
other than at the end), as this may cause some syslog relays/servers to parse
the message incorrectly.
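One defensive pattern is to collapse embedded newlines before calling `log()`. The helper below is a hypothetical sketch, not part of this module's API:

```javascript
// Drop a trailing newline (log() appends one anyway) and replace any
// embedded CR/LF runs with a single space, so relays see one message line.
function sanitizeMessage(message) {
    return message.replace(/[\r\n]+$/, "").replace(/[\r\n]+/g, " ");
}

console.log(sanitizeMessage("disk full\non /dev/sda1\n")); // → "disk full on /dev/sda1"
```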
The following example sends a message to a remote host:

    var options = {
        facility: syslog.Facility.Daemon,
        severity: syslog.Severity.Critical
    };

    var message = "something is wrong with this daemon!";

    client.log(message, options, function(error) {
        if (error) {
            console.error(error);
        } else {
            console.log("sent message successfully");
        }
    });
# Example Programs
Example programs are included under the module's `example` directory.
# Running tests and test coverage
Tests can be run with:
```
npm test
```
Install dev dependencies before running test coverage:
```
npm install --dev
npm run coverage
```
The coverage report is generated at `coverage/lcov-report/index.html`.
# Bugs & Known Issues
None, yet!
Bug reports should be sent to <[email protected]>.
# Changes
## Version 1.0.0 - 31/07/2015
* Initial release
## Version 1.0.1 - 31/07/2015
* Correct typo in README.md
## Version 1.0.2 - 31/07/2015
* Correct typo in README.md :(
## Version 1.0.3 - 01/08/2015
* Correct typo in README.md :( :(
## Version 1.0.4 - 08/08/2015
* Transport error events are now propagated to an `error` event in the Syslog
client
## Version 1.0.5 - 22/10/2015
* Redundant release
## Version 1.0.6 - 22/10/2015
* Slight formatting error in the README.md file
## Version 1.0.7 - 08/02/2016
* Remove debug `console.dir()` statement accidentally left in code
## Version 1.0.8 - 26/08/2016
* Variable `key` in `_expandConstantObject()` missing `var` declaration
## Version 1.0.9 - 30/08/2016
* Added mocha test framework
* Added istanbul test coverage
* Added tests for approximately 89% coverage
* Fixed bug where transports were not being reused
* Fixed bug where some connections would not `close()`
* Made `options` in `.log()` optional as per existing documentation
* Made `cb` in `.log()` optional and updated documentation
* Fixed bug where the `error` event and `.log()` callback would not predictably receive the error
* `close` event is now always fired when `.close()` is called, regardless of open connections
# Roadmap
Suggestions and requirements should be sent to <[email protected]>.
# License
Copyright (c) 2015 Stephen Vickers
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
# Author
Stephen Vickers <[email protected]>
|
namespace MFDMF_Models
{
public enum SupportedThrottleTypes
{
WH,
HC
}
}
|
package uk.gov.dvla.vehicles.presentation.common.webserviceclients.acquire
import scala.concurrent.Future
import uk.gov.dvla.vehicles.presentation.common.clientsidesession.TrackingId
trait AcquireService {
def invoke(cmd: AcquireRequestDto, trackingId: TrackingId): Future[(Int, Option[AcquireResponseDto])]
}
|
require 'rest-client'
require 'json'

module NpmApi
  class Client
    def self.get(path)
      base_url = "https://api.npmjs.org/"
      url = base_url + path
      res = RestClient.get(url)
      JSON.parse(res.body)
    end
  end
end
|
#pragma once
#include "CesiumGeospatial/Ellipsoid.h"
#include "CesiumGeospatial/Library.h"
#include <glm/mat4x4.hpp>
#include <glm/vec3.hpp>
namespace CesiumGeospatial {
/**
* @brief Transforms positions to various reference frames.
*/
class CESIUMGEOSPATIAL_API Transforms final {
public:
/**
* @brief Computes a transformation from east-north-up axes to an
* ellipsoid-fixed reference frame.
*
* Computes a 4x4 transformation matrix from a reference frame with an
* east-north-up axes centered at the provided origin to the provided
* ellipsoid's fixed reference frame. The local axes are defined as: <ul>
* <li>The `x` axis points in the local east direction.</li>
* <li>The `y` axis points in the local north direction.</li>
* <li>The `z` axis points in the direction of the ellipsoid surface normal
* which passes through the position.</li>
* </ul>
*
* @param origin The center point of the local reference frame.
* @param ellipsoid The {@link Ellipsoid} whose fixed frame is used in the
* transformation. Default value: {@link Ellipsoid::WGS84}.
* @return The transformation matrix
*/
static glm::dmat4x4 eastNorthUpToFixedFrame(
const glm::dvec3& origin,
const Ellipsoid& ellipsoid = Ellipsoid::WGS84) noexcept;
};
} // namespace CesiumGeospatial
|
# encoding: UTF-8
# Global requires
require 'multi_json'
# Local requires
require 'gooddata/models/models'
module ProjectHelper
PROJECT_ID = 'we1vvh4il93r0927r809i3agif50d7iz'
PROJECT_URL = "/gdc/projects/#{PROJECT_ID}"
PROJECT_TITLE = 'GoodTravis'
PROJECT_SUMMARY = 'No summary'
def self.get_default_project(opts = { :client => GoodData.connection })
GoodData::Project[PROJECT_ID, opts]
end
def self.delete_old_projects(opts = { :client => GoodData.connection })
projects = opts[:client].projects
projects.each do |project|
next if project.json['project']['meta']['author'] != opts[:client].user.uri
next if project.pid == PROJECT_ID
begin
puts "Deleting project #{project.title}"
project.delete
rescue => e
puts 'ERROR: ' + e.to_s
end
end
end
def self.create_random_user(client)
num = rand(1e7)
login = "gemtest#{num}@gooddata.com"
GoodData::Membership.create({
email: login,
login: login,
first_name: 'the',
last_name: num.to_s,
role: 'editor',
password: CryptoHelper.generate_password,
domain: ConnectionHelper::DEFAULT_DOMAIN
}, client: client)
end
end
|
# frozen_string_literal: true
class UnsubmittedCourseAlertMailer < ApplicationMailer
def self.send_email(alert)
return if !alert.user || alert.user.email.blank?
email(alert).deliver_now
end
def email(alert)
@instructor = alert.user
@name = @instructor.real_name || @instructor.username
@course_url = "https://#{ENV['dashboard_url']}/courses/#{alert.course.slug}"
@classroom_program_manager = SpecialUsers.classroom_program_manager
subject = 'Reminder: Submit your Wiki Education course page'
mail(to: @instructor.email,
reply_to: @classroom_program_manager.email,
subject: subject)
end
end
|
<?php defined('BASEPATH') OR exit('No direct script access allowed');

class Login extends CI_Controller{

    function __construct(){
        parent::__construct();
        $this->load->model("post_model");
        //$this->load->library('session');
    }

    function index(){
        $this->load->view('login');
    }

    function validar(){
        $data["posts"] = $this->post_model->getPosts();
        $iduser = $this->input->post('user');
        $password = $this->input->post('password');
        $sesion_data = array(
            'idusuario' => $iduser,
            'contra' => $password
        );
        $this->session->set_userdata($sesion_data);
        $data2['name_user'] = $iduser;
        if($iduser != "" && $password != ""){
            if($iduser == "tecnico"){
                if($password == "12345"){
                    $this->load->view('plantilla/header');
                    $this->load->view('plantilla/menu_nav', $data2);
                    $this->load->view('plantilla/sidebar', $data2);
                    $this->load->view('plantilla/content', $data);
                    $this->load->view('plantilla/footer');
                    $this->load->view('plantilla/sidebar2');
                    $this->load->view('plantilla/scripts');
                }
                else{
                    // Wrong password: return to the login view
                    $this->load->view('login');
                }
            }else{
                // Unknown user: return to the login view
                $this->load->view('login');
            }
        }
    }
}
?>
|
using System;
using JetBrains.Annotations;
using Unity.Collections;
using Unity.Networking.Transport;
using UnityEngine;
namespace Netling
{
public static class StreamExtension
{
public static void WriteBool(ref this DataStreamWriter writer, bool b)
{
writer.WriteByte((byte) (b ? 1 : 0));
}
public static void WriteManagedString(ref this DataStreamWriter writer, [NotNull] string str)
{
if (str == null) throw new ArgumentNullException(nameof(str));
writer.WriteInt(str.Length);
foreach (char c in str)
{
writer.WriteShort((short) c);
}
}
public static void WriteVector3(ref this DataStreamWriter writer, Vector3 vector3)
{
writer.WriteFloat(vector3.x);
writer.WriteFloat(vector3.y);
writer.WriteFloat(vector3.z);
}
public static void WriteQuaternion(ref this DataStreamWriter writer, Quaternion quaternion)
{
writer.WriteFloat(quaternion.x);
writer.WriteFloat(quaternion.y);
writer.WriteFloat(quaternion.z);
writer.WriteFloat(quaternion.w);
}
public static void WriteObjects(ref this DataStreamWriter writer, object[] objects, Type[] types)
{
if (types.Length != objects.Length)
throw new NetException("Cannot serialize objects: wrong number of arguments");
for (var i = 0; i < types.Length; i++)
{
Type type = types[i];
if (type == typeof(int)) writer.WriteInt((int) objects[i]);
else if (type == typeof(uint)) writer.WriteUInt((uint) objects[i]);
else if (type == typeof(bool)) writer.WriteBool((bool) objects[i]);
else if (type == typeof(byte)) writer.WriteByte((byte) objects[i]);
else if (type == typeof(byte[]))
{
writer.WriteInt(((byte[]) objects[i]).Length);
var bytes = new NativeArray<byte>((byte[]) objects[i], Allocator.Temp);
writer.WriteBytes(bytes);
bytes.Dispose();
}
else if (type == typeof(short)) writer.WriteShort((short) objects[i]);
else if (type == typeof(ushort)) writer.WriteUShort((ushort) objects[i]);
else if (type == typeof(char)) writer.WriteShort((short) objects[i]);
else if (type == typeof(float)) writer.WriteFloat((float) objects[i]);
else if (type == typeof(string)) writer.WriteManagedString((string) objects[i]);
else if (type == typeof(Vector3)) writer.WriteVector3((Vector3) objects[i]);
else if (type == typeof(Quaternion)) writer.WriteQuaternion((Quaternion) objects[i]);
else if (typeof(IStreamSerializable).IsAssignableFrom(type))
((IStreamSerializable) objects[i]).Serialize(ref writer);
else throw new NetException($"Cannot serialize rpc argument of type {type}");
}
}
public static bool ReadBool(ref this DataStreamReader reader)
{
return reader.ReadByte() != 0;
}
/// <summary>
/// Reads a <see cref="string"/> from the stream, corresponding to
/// <see cref="WriteManagedString"/>.
/// <br/><br/>
/// As opposed to <see cref="DataStreamReader.ReadString()"/>, this reads
/// a <see cref="string"/> (maximum length <see cref="int.MaxValue"/>) and not a <see cref="NativeString64"/>.
/// </summary>
public static string ReadManagedString(ref this DataStreamReader reader)
{
int length = reader.ReadInt();
var chars = new char[length];
for (var i = 0; i < length; i++)
chars[i] = (char) reader.ReadShort();
return new string(chars);
}
public static Vector3 ReadVector3(ref this DataStreamReader reader)
{
return new Vector3(
reader.ReadFloat(),
reader.ReadFloat(),
reader.ReadFloat());
}
public static Quaternion ReadQuaternion(ref this DataStreamReader reader)
{
return new Quaternion(
reader.ReadFloat(),
reader.ReadFloat(),
reader.ReadFloat(),
reader.ReadFloat());
}
public static object[] ReadObjects(ref this DataStreamReader reader, Type[] types)
{
var objects = new object[types.Length];
for (var i = 0; i < types.Length; i++)
{
Type type = types[i];
if (type == typeof(int)) objects[i] = reader.ReadInt();
else if (type == typeof(bool)) objects[i] = reader.ReadBool();
else if (type == typeof(uint)) objects[i] = reader.ReadUInt();
else if (type == typeof(byte)) objects[i] = reader.ReadByte();
else if (type == typeof(byte[]))
{
var bytes = new NativeArray<byte>(reader.ReadInt(), Allocator.Temp);
reader.ReadBytes(bytes);
objects[i] = bytes.ToArray();
bytes.Dispose();
}
else if (type == typeof(short)) objects[i] = reader.ReadShort();
else if (type == typeof(ushort)) objects[i] = reader.ReadUShort();
else if (type == typeof(char)) objects[i] = (char) reader.ReadUShort();
else if (type == typeof(float)) objects[i] = reader.ReadFloat();
else if (type == typeof(string)) objects[i] = reader.ReadManagedString();
else if (type == typeof(Vector3)) objects[i] = reader.ReadVector3();
else if (type == typeof(Quaternion)) objects[i] = reader.ReadQuaternion();
else if (typeof(IStreamSerializable).IsAssignableFrom(type))
{
objects[i] = Activator.CreateInstance(type);
((IStreamSerializable) objects[i]).Deserialize(ref reader);
}
else throw new NetException($"Cannot deserialize object of type {type}");
}
return objects;
}
public static void DiscardBytes(ref this DataStreamReader reader, int byteCount)
{
var bytes = new NativeArray<byte>(byteCount, Allocator.Temp);
reader.ReadBytes(bytes);
bytes.Dispose();
}
}
}
|
require 'test_helper'
class InstructionsControllerTest < ActionDispatch::IntegrationTest
setup do
@instruction = instructions(:one)
end
test "should get index" do
get instructions_url
assert_response :success
end
test "should get new" do
get new_instruction_url
assert_response :success
end
test "should create instruction" do
assert_difference('Instruction.count') do
post instructions_url, params: { instruction: { } }
end
assert_redirected_to instruction_url(Instruction.last)
end
test "should show instruction" do
get instruction_url(@instruction)
assert_response :success
end
test "should get edit" do
get edit_instruction_url(@instruction)
assert_response :success
end
test "should update instruction" do
patch instruction_url(@instruction), params: { instruction: { } }
assert_redirected_to instruction_url(@instruction)
end
test "should destroy instruction" do
assert_difference('Instruction.count', -1) do
delete instruction_url(@instruction)
end
assert_redirected_to instructions_url
end
end
|
package krasilov.dima.service
import arrow.core.Either
import kotlinx.datetime.*
import krasilov.dima.integration.FonoApi
import krasilov.dima.model.DAO
import krasilov.dima.web.*
import org.jetbrains.exposed.sql.*
import org.jetbrains.exposed.sql.transactions.transaction
import java.time.LocalDateTime
// FIXME: better to split into a bookingsService and a devicesService
class ApiService(private val fonoApi: FonoApi, private val dao: DAO) {
fun getAllDevices(): List<DeviceInfoExtended> = transaction {
dao.getAllDevices()
.map { DeviceInfoExtended(it, fonoApi.getDeviceInfo(it.name)) }
}
fun book(bookDevice: BookDevice): Either<Error, DeviceInfoExtended> =
transaction {
val (userId, deviceId) = bookDevice
val device = dao.findDevice(deviceId)
?: return@transaction Either.left(Error(ErrorCode.BT00002, "Device Not Found by ID: $deviceId"))
if (device.available.not()) {
return@transaction Either.left(Error(ErrorCode.BT00002, "Already booked: $device"))
}
dao.bookDevice(userId, deviceId)
val device1 = dao.findDevice(deviceId)!!
Either.right(DeviceInfoExtended(device1, fonoApi.getDeviceInfo(device1.name)))
}
fun returnDevice(returnDevice: ReturnDevice): Either<Error, DeviceInfoExtended> = transaction {
val (userId, deviceId) = returnDevice
val device = dao.findDevice(deviceId)
?: return@transaction Either.left(Error(ErrorCode.BT00002, "Device Not Found by ID: $deviceId"))
if (device.available) {
return@transaction Either.left(Error(ErrorCode.BT00003, "Device is not booked"))
}
if (device.booking?.user?.id != userId) {
return@transaction Either.left(
Error(
ErrorCode.BT00002,
"Device was booked by another user: ${device.booking}"
)
)
}
val now = LocalDateTime.now()
dao.returnDevice(device.booking.id, now)
Either.right(
DeviceInfoExtended(
device.copy(
booking = device.booking.copy(returnedAt = now.toKotlinLocalDateTime())
),
fonoApi.getDeviceInfo(device.name)
)
)
}
fun getBookingsHistory(): List<BookingHistoryRecord> = transaction {
dao.getAllBookings()
}
}
|
import numpy as np, argparse, sys
sys.path.append("..")
from src.RobustPolyfit import robust_polyfit
from scipy.stats import norm
import matplotlib.pyplot as plt
import time
def main():
parser = argparse.ArgumentParser(description="Test the RobustPolyfit module.")
parser.add_argument("noisefrac", type=float, nargs="?", default=0.1,
help="The fraction of the generated datapoints that are outliers.")
parser.add_argument("numpoints", type=int, nargs="?", default=100,
help="The total number of datapoints.")
parser.add_argument("sigma", type=float, nargs="?", default=3,
help="Scale value for the Cauchy distribution used to generate outliers.")
parser.add_argument("--linear", type=int, help="Run specified number of tests using "
"polyorder 1.")
parser.add_argument("--quad", type=int, help="Run specified number of tests using "
"polyorder 2.")
parser.add_argument("--cubic", type=int, help="Run specified number of tests using "
"polyorder 3.")
parser.add_argument("--siegelslopes", action="store_true", help="Compare on time with "
"the scipy siegelslopes function (polyorder 1 only).")
args = parser.parse_args()
if args.linear is not None:
set_up_test(args.noisefrac, args.numpoints, args.sigma, args.linear,
args.siegelslopes, 1)
elif args.quad is not None:
set_up_test(args.noisefrac, args.numpoints, args.sigma, args.quad,
args.siegelslopes, 2)
elif args.cubic is not None:
set_up_test(args.noisefrac, args.numpoints, args.sigma, args.cubic,
args.siegelslopes, 3)
def set_up_test(noisefrac, numpoints, sigma, num_iter, use_siegel, polyorder):
x, y = [], []
for i in range(num_iter):
xnew = np.linspace(-10,10,numpoints)
#Add Gaussian noise to the randomly generated polynomial.
if polyorder == 1:
slope, intcpt = np.random.uniform(-10,10), np.random.uniform(-10,10)
ynew = slope*xnew + intcpt + norm.rvs(0,1, size=numpoints)
elif polyorder == 2:
slope1, slope2, intcpt = np.random.uniform(-5,5),\
np.random.uniform(-10, 10), np.random.uniform(-10,10)
ynew = slope1*xnew**2 + slope2*xnew + intcpt + norm.rvs(0,1, size=numpoints)
elif polyorder == 3:
slope1, slope2, slope3, intcpt = np.random.uniform(-2,2),\
np.random.uniform(-5, 5), np.random.uniform(-10,10), np.random.uniform(-10,10)
ynew = slope1*xnew**3 + slope2*xnew**2 + slope3*xnew + intcpt + norm.rvs(0,1, size=numpoints)
idx = np.random.choice(numpoints, size=int(noisefrac*numpoints), replace=False)
#Add outliers but in one direction only to create a skewed distribution.
ynew[idx] = ynew[idx] * (1 + np.random.uniform(0.3,4,size=idx.shape[0]))
x.append(xnew)
y.append(ynew)
if polyorder > 1:
modelfit(x, y, use_siegel=False, polyorder=polyorder, noisefrac=noisefrac)
else:
modelfit(x, y, use_siegel, polyorder=polyorder, noisefrac=noisefrac)
def modelfit(x, y, use_siegel, polyorder, noisefrac):
robmods = []
start_time = time.time()
for i in range(len(x)):
robmod = robust_polyfit(polyorder=polyorder)
robmod.fit(x[i], y[i])
robmods.append(robmod)
end_time = time.time()
average_time = (end_time - start_time) / len(x)
if use_siegel:
from scipy.stats import siegelslopes
sslopes, ssints = [], []
start_time = time.time()
for i in range(len(x)):
sslope, sint = siegelslopes(y[i], x[i])
sslopes.append(sslope)
ssints.append(sint)
end_time = time.time()
average_siegel_time = (end_time - start_time) / len(x)
print("Average siegelslopes time: %s"%average_siegel_time)
print("Average fit time: %s"%average_time)
for i in range(len(x)):
fig = plt.figure()
plt.scatter(x[i], y[i], s=10, label="raw data")
plt.plot(x[i], robmods[i].predict(x[i]), color="black", linestyle="dashed",
label="Student T reg fit")
coefs = np.polyfit(x[i], y[i], deg=polyorder)
if polyorder == 1:
plt.plot(x[i], coefs[0]*x[i] + coefs[1], color="red", linestyle="dashed",
label="Standard least squares fit")
if polyorder == 2:
plt.plot(x[i], coefs[0]*x[i]**2 + coefs[1]*x[i] + coefs[2],
color="red", linestyle="dashed",
label="Standard least squares fit")
if polyorder == 3:
plt.plot(x[i], coefs[0]*x[i]**3 + coefs[1]*x[i]**2 + coefs[2]*x[i] + coefs[3],
color="red", linestyle="dashed",
label="Standard least squares fit")
if use_siegel:
plt.plot(x[i], sslopes[i]*x[i] + ssints[i], color="green", linestyle="dashed",
label="siegelslopes fit")
plt.title("Robust regression fit, polyorder %s,\noutlier fraction %s"
%(polyorder, noisefrac))
plt.xlabel("x")
plt.ylabel("y")
plt.legend()
plt.savefig("Polyorder_%s_%s_noisefrac_%s.png"%(polyorder, i, noisefrac))
plt.close()
if __name__ == "__main__":
main()
|
{
This is a module implementing dynamic memory stack.
Running this file will not do anything, run 'stackdemo.pas' instead.
}
Interface
type Item = record
Value: String;
Next: ^Item;
end;
procedure Push(Val: String);
function Pop: String;
function CanPop: Boolean;
implementation
var Top: ^Item = nil;
procedure Push(Val: String);
begin
var p: ^Item;
New(p);
p^.Value := Val;
p^.Next := Top;
Top := p;
end;
function Pop: String;
begin
Result := Top^.Value;
Top := Top^.Next;
{No need for Dispose, GC will save us all.}
end;
function CanPop: Boolean;
begin
Result := Top<>nil;
end;
End.
|
<?php
namespace app\index\controller;
use think\Controller;
use think\Request;
use think\Db;

class Index extends Controller {
    public function index() {
        // Site home page
        return view();
    }

    public function loginMid(){
        // Handle login
        $req = request()->param();
        $res = Db::table('user')->where('sid', $req['sid'])->where('password', $req['password'])->find();
        // If the account is not found or the password is wrong, return to the login page
        if(empty($res)) {
            return $this->redirect('login');
        }else{
            // On success, store the student ID in the session and redirect to the home page
            session('login', $req['sid']);
            return $this->redirect('index');
        }
    }

    // Handle registration
    public function regMid(){
        $req = request()->param();
        // Insert the submitted parameters into the database, then redirect to the login page
        $res = Db::table('user')->insert($req);
        $this->redirect('login');
    }

    public function meJoin() {
        // List the teams I have joined
        if(!session('login')){
            // Redirect to the login page if not logged in
            $this->redirect('/team/public/index/index/login');
        }
        // Look up this account's records in the activity detail table
        $res = Db::table('detail')->where('sid', session('login'))->select();
        $arr = array();
        // For each team id, fetch the corresponding team from the team table
        for($i = 0; $i < count($res); $i++){
            $arr[$i] = Db::table('team')->where('tid', $res[$i]['tid'])->select();
        }
        $this->assign('arr', $arr);
        return $this->fetch('meJoin');
    }

    public function joinMe() {
        // List members who joined my teams
        if(!session('login')){
            // Redirect to the login page if not logged in
            $this->redirect('/team/public/index/index/login');
        }else{
            // Look up the teams created by this account
            $res = Db::table('team')->where('sid', session('login'))->select();
            $arr = array();
            // For each team id, look up members of that team in the detail table
            for($i = 0; $i < count($res); $i++){
                if((Db::table('detail')->where('tid', $res[$i]['tid'])->select()) != null){
                    $arr[$i] = Db::table('detail')->where('tid', $res[$i]['tid'])->find();
                    // Fetch the activity name
                    $arr[$i]['actName'] = Db::table('actList')->where('actId', $res[$i]['aid'])->column('actName');
                    // Fetch the activity end time
                    $arr[$i]['endTime'] = Db::table('actList')->where('actId', $res[$i]['aid'])->column('endTime');
                }
            }
            $this->assign('arr', $arr);
            return $this->fetch('joinMe');
        }
    }

    public function login() {
        return view('login');
    }

    public function reg() {
        return view('reg');
    }
}
|
package main
import (
"flag"
"fmt"
"runtime"
"time"
)
var (
numBurn int
updateInterval int
)
func cpuBurn() {
for {
for i := 0; i < 2147483647; i++ {
}
runtime.Gosched()
}
}
func init() {
flag.IntVar(&numBurn, "n", 0, "number of cores to burn (0 = all)")
flag.IntVar(&updateInterval, "u", 10, "seconds between updates (0 = don't update)")
flag.Parse()
if numBurn <= 0 {
numBurn = runtime.NumCPU()
}
}
func main() {
runtime.GOMAXPROCS(numBurn)
fmt.Printf("Burning %d CPUs/cores\n", numBurn)
for i := 0; i < numBurn; i++ {
go cpuBurn()
}
if updateInterval > 0 {
t := time.Tick(time.Duration(updateInterval) * time.Second)
for secs := updateInterval; ; secs += updateInterval {
<-t
fmt.Printf("%d seconds\n", secs)
}
} else {
select {} // wait forever
}
}
|
import { Component, OnInit, ViewChild, Input } from '@angular/core';
import { MatPaginator, MatSort } from '@angular/material';
import { NoticelistDataSource } from './noticelist-datasource';
import { Notice } from '@app/entity/notice/notice.model';
import { Store } from '@ngrx/store';
import * as fromNotice from '../../entity/notice/notice.reducer';
import { SelectionModel } from '@angular/cdk/collections';
import { Router } from '@angular/router';
@Component({
selector: 'anms-noticelist',
templateUrl: './noticelist.component.html',
styleUrls: ['./noticelist.component.css']
})
export class NoticelistComponent implements OnInit {
@ViewChild(MatPaginator)
paginator: MatPaginator;
@ViewChild(MatSort)
sort: MatSort;
dataSource: NoticelistDataSource;
selection = new SelectionModel<Notice>(true, []);
constructor(
private router: Router,
private store: Store<fromNotice.NoticeState>
) { }
/** Columns displayed in the table. Columns IDs can be added, removed, or reordered. */
displayedColumns = ['select', 'number', 'title', 'readCount'];
ngOnInit() {
this.dataSource = new NoticelistDataSource(
this.paginator,
this.sort,
this.store
);
}
@Input()
NoticeData: Notice[];
masterToggle() {
this.isAllSelected()
? this.selection.clear()
: this.dataSource.data.forEach(row => this.selection.select(row));
}
isAllSelected() {
const numSelected = this.selection.selected.length;
const numRows = this.dataSource.data.length;
return numSelected === numRows;
}
checkboxLabel(row?: Notice): string {
if (!row) {
return `${this.isAllSelected() ? 'select' : 'deselect'} all`;
}
return `${
this.selection.isSelected(row) ? 'deselect' : 'select'
} row ${row.id + 1}`;
}
alert(id: string) {
console.log(id);
this.router.navigate(['/noticedetail', id]);
}
// private navigate(product){
// this.router.navigate(['/participants', product]); //we can send product object as route param
// }
}
|
const utils = require('.');
describe('newRequester', () => {
test('no params', () => {
const req = utils.newRequester();
expect(req).toBeTruthy();
});
});
|
use super::*;
use assert2::assert;
#[test]
fn on_empty_container_builder_increases_len_to_1() {
// Given an empty list
let item = 42_i32;
let expected_len = 1;
let sut = ContainerBuilder::new();
assert!(sut.is_empty());
// When
let result = sut.register_instance(item);
// Then
assert!(result.len() == expected_len);
}
#[test]
fn on_container_with_len_1_increases_len_to_2() {
// Given a list containing one item
let given_item = 42_i32;
let item = Foo { _bar: -7, _baz: "test" };
let expected_len = 2;
let sut = ContainerBuilder::new().register_instance(given_item);
assert!(sut.len() == 1);
// And When
let result = sut.register_instance(item);
// Then
assert!(result.len() == expected_len);
}
|
PLANETS = ('Mercury', 'Venus', 'Earth', 'Mars',
'Jupiter', 'Saturn', 'Uranus', 'Neptune')
def get_planet_name(planet_id):
return PLANETS[planet_id - 1]
|
# golang - graceful shutdown example
Sample implementation of graceful shutdown for a Go application that listens
for OS signals and propagates cancellation using context.WithCancel.
|
This is my first trip,
my first foreign trip as a first lady.
Can you believe that?
(Applause)
And while this is not my first visit to the U.K.,
I have to say that I am glad that this is my first official visit.
The special relationship between the United States and the U.K.
is based not only on the relationship between governments,
but the common language and the values that we share,
and I'm reminded of that by watching you all today.
During my visit I've been especially honored
to meet some of Britain's most extraordinary women --
women who are paving the way for all of you.
And I'm honored to meet you,
the future leaders of Great Britain and this world.
And although the circumstances of our lives may seem very distant,
with me standing here as the First Lady of the United States of America,
and you, just getting through school,
I want you to know that we have very much in common.
For nothing in my life's path
would have predicted that I'd be standing here
as the first African-American First Lady
of the United States of America.
There is nothing in my story that would land me here.
I wasn't raised with wealth or resources
or any social standing to speak of.
I was raised on the South Side of Chicago.
That's the real part of Chicago.
And I was the product of a working-class community.
My father was a city worker all of his life,
and my mother was a stay-at-home mom.
And she stayed at home to take care of me and my older brother.
Neither of them attended university.
My dad was diagnosed with multiple sclerosis
in the prime of his life.
But even as it got harder for him to walk
and get dressed in the morning --
I saw him struggle more and more --
my father never complained about his struggle.
He was grateful for what he had.
He just woke up a little earlier and worked a little harder.
And my brother and I were raised with all that you really need:
love, strong values
and a belief that with a good education
and a whole lot of hard work,
that there was nothing that we could not do.
I am an example of what's possible
when girls from the very beginning of their lives
are loved and nurtured by the people around them.
I was surrounded by extraordinary women in my life:
grandmothers, teachers, aunts, cousins, neighbors,
who taught me about quiet strength and dignity.
And my mother, the most important role model in my life,
who lives with us at the White House
and helps to care for our two little daughters,
Malia and Sasha.
She's an active presence in their lives, as well as mine,
and is instilling in them
the same values that she taught me and my brother:
things like compassion, and integrity,
and confidence, and perseverance --
all of that wrapped up in an unconditional love
that only a grandmother can give.
I was also fortunate enough to be cherished and encouraged
by some strong male role models as well,
including my father, my brother, uncles and grandfathers.
The men in my life taught me some important things, as well.
They taught me about what a respectful relationship
should look like between men and women.
They taught me about what a strong marriage feels like:
that it's built on faith and commitment
and an admiration for each other's unique gifts.
They taught me about what it means
to be a father
and to raise a family.
And not only to invest in your own home
but to reach out and help raise kids
in the broader community.
And these were the same qualities
that I looked for in my own husband,
Barack Obama.
And when we first met,
one of the things that I remember is that he took me out on a date.
And his date was to go with him to a community meeting.
(Laughter)
I know, how romantic.
(Laughter)
But when we met, Barack was a community organizer.
He worked, helping people to find jobs
and to try to bring resources into struggling neighborhoods.
As he talked to the residents in that community center,
he talked about two concepts.
He talked about "the world as it is" and "the world as it should be."
And I talked about this throughout the entire campaign.
What he said, that all too often,
is that we accept the distance between those two ideas.
And sometimes we settle for the world as it is,
even when it doesn't reflect our values and aspirations.
But Barack reminded us on that day,
all of us in that room, that we all know
what our world should look like.
We know what fairness and justice and opportunity look like.
We all know.
And he urged the people in that meeting,
in that community,
to devote themselves to closing the gap
between those two ideas,
to work together to try to make the world as it is
and the world as it should be, one and the same.
And I think about that today because I am
reminded and convinced that all of you in this school
are very important parts of closing that gap.
You are the women who will build the world as it should be.
You're going to write the next chapter in history.
Not just for yourselves, but for your generation
and generations to come.
And that's why getting a good education
is so important.
That's why all of this that you're going through --
the ups and the downs, the teachers that you love and the teachers that you don't --
why it's so important.
Because communities and countries and ultimately the world
are only as strong as the health of their women.
And that's important to keep in mind.
Part of that health includes an outstanding education.
The difference between a struggling family and a healthy one
is often the presence of an empowered woman
or women at the center of that family.
The difference between a broken community and a thriving one
is often the healthy respect between men and women
who appreciate the contributions each other makes to society.
The difference between a languishing nation
and one that will flourish
is the recognition that we need equal access to education
for both boys and girls.
And this school, named after the U.K.'s first female doctor,
and the surrounding buildings named for Mexican artist Frida Kahlo,
Mary Seacole,
the Jamaican nurse known as the "black Florence Nightingale,"
and the English author, Emily Bronte,
honor women who fought sexism, racism and ignorance,
to pursue their passions to feed their own souls.
They allowed for no obstacles.
As the sign said back there, "without limitations."
They knew no other way to live
than to follow their dreams.
And having done so, these women
moved many obstacles.
And they opened many new doors
for millions of female doctors and nurses
and artists and authors,
all of whom have followed them.
And by getting a good education,
you too can control your own destiny.
Please remember that.
If you want to know the reason why I'm standing here,
it's because of education.
I never cut class. Sorry, I don't know if anybody is cutting class.
I never did it.
I loved getting As.
I liked being smart.
I liked being on time. I liked getting my work done.
I thought being smart was cooler than anything in the world.
And you too, with these same values,
can control your own destiny.
You too can pave the way.
You too can realize your dreams,
and then your job is to reach back
and to help someone just like you do the same thing.
History proves that it doesn't matter
whether you come from a council estate
or a country estate.
Your success will be determined
by your own fortitude,
your own confidence, your own individual hard work.
That is true. That is the reality of the world that we live in.
You now have control over your own destiny.
And it won't be easy -- that's for sure.
But you have everything you need.
Everything you need to succeed,
you already have, right here.
My husband works in this big office.
They call it the Oval Office.
In the White House, there's the desk that he sits at --
it's called the Resolute desk.
It was built by the timber of Her Majesty's Ship Resolute
and given by Queen Victoria.
It's an enduring symbol of the friendship between our two nations.
And its name, Resolute,
is a reminder of the strength of character that's required
not only to lead a country,
but to live a life of purpose, as well.
And I hope in pursuing your dreams, you all remain resolute,
that you go forward without limits,
and that you use your talents -- because there are many; we've seen them;
it's there --
that you use them to create the world as it should be.
Because we are counting on you.
We are counting on every single one of you
to be the very best that you can be.
Because the world is big.
And it's full of challenges.
And we need strong, smart, confident young women
to stand up and take the reins.
We know you can do it. We love you. Thank you so much.
(Applause)
|
<?php
namespace x3ts\mqtt\protocol\constants;
abstract class QoS
{
public const AT_MOST_ONCE = 0;
public const AT_LEAST_ONCE = 1;
public const EXACTLY_ONCE = 2;
}
|
package xyz.fcampbell.rxplayservices.awareness
import android.content.Context
import com.google.android.gms.awareness.Awareness
import com.google.android.gms.awareness.AwarenessOptions
import com.google.android.gms.awareness.FenceApi
import com.google.android.gms.awareness.fence.FenceQueryRequest
import com.google.android.gms.awareness.fence.FenceQueryResult
import com.google.android.gms.awareness.fence.FenceUpdateRequest
import com.google.android.gms.common.api.Scope
import com.google.android.gms.common.api.Status
import io.reactivex.Observable
import xyz.fcampbell.rxplayservices.base.ApiClientDescriptor
import xyz.fcampbell.rxplayservices.base.ApiDescriptor
import xyz.fcampbell.rxplayservices.base.RxPlayServicesApi
/**
* Wraps [Awareness.FenceApi]
*/
@Suppress("unused")
class RxFenceApi(
apiClientDescriptor: ApiClientDescriptor,
options: AwarenessOptions,
vararg scopes: Scope
) : RxPlayServicesApi<FenceApi, AwarenessOptions>(
apiClientDescriptor,
ApiDescriptor(Awareness.API, Awareness.FenceApi, options, *scopes)
) {
constructor(
context: Context,
options: AwarenessOptions,
vararg scopes: Scope
) : this(ApiClientDescriptor(context), options, *scopes)
fun updateFences(fenceUpdateRequest: FenceUpdateRequest): Observable<Status> {
return fromPendingResult { updateFences(it, fenceUpdateRequest) }
}
fun queryFences(fenceQueryRequest: FenceQueryRequest): Observable<FenceQueryResult> {
return fromPendingResult { queryFences(it, fenceQueryRequest) }
}
}
|
Stylebot
==========================
This stylebot sample restyles GitHub using Input.
* Ello restyle, by Patch Hofweber <http://stylebot.me/styles/7134>
Comments? <[email protected]>
|
package gae
import (
"net/http"
"google.golang.org/appengine/blobstore"
"google.golang.org/appengine"
"encoding/csv"
"strings"
"fmt"
)
func ServeHandler(w http.ResponseWriter, r *http.Request) {
ctx := appengine.NewContext(r)
//blobstore.Send(w, appengine.BlobKey(r.FormValue("blobKey")))
out := blobstore.NewReader(ctx, appengine.BlobKey(r.FormValue("blobKey")))
data := csv.NewReader(out)
// Dump the entire CSV.
cols, err := data.Read()
// If the first read returns an error, the blob does not exist.
if err != nil {
http.Error(w, "blob file doesn't exist", http.StatusNotFound)
return
}
for err == nil {
fmt.Print(cols[0]) // server-side debug log of the first column
w.Write([]byte(strings.Join(cols, ",")))
w.Write([]byte("\n"))
cols, err = data.Read()
}
}
|
<?php
/**
* DataStructures for PHP
*
* @link https://github.com/SiroDiaz/DataStructures
* @copyright Copyright (c) 2017 Siro Díaz Palazón
* @license https://github.com/SiroDiaz/DataStructures/blob/master/README.md (MIT License)
*/
namespace DataStructures\Sets;
use DataStructures\Trees\Nodes\DisjointNode;
/**
* DisjointSet.
*
* The DisjointSet class represents a disjoint set.
* Operations take O(n) in the worst case, except makeSet, which takes
* constant time, O(1).
* Basic operations are makeSet, find and union.
*
* @author Siro Diaz Palazon <[email protected]>
*/
class DisjointSet {
private $subsets;
public function __construct() {
$this->subsets = [];
}
/**
* Creates a new set/tree with zero children and parents.
* Its parent points to itself and the rank is 0 when new
* set is created.
*
* @param mixed $data the data to store.
* @return DataStructures\Trees\Nodes\DisjointNode the node created.
*/
public function makeSet($data) : DisjointNode {
$newSet = new DisjointNode($data);
$this->subsets[] = $newSet;
return $newSet;
}
/**
* Returns the representative (the root of the tree that contains
* $vertex) and also applies path compression along the way.
*
* @param int $vertex the index of the node from which to start
* searching for the root.
* @return int the index of the representative node.
*/
public function find($vertex) {
if($this->subsets[$vertex]->parent === null || $this->subsets[$vertex]->parent < 0) {
return $vertex;
}
$this->subsets[$vertex]->parent = $this->find($this->subsets[$vertex]->parent);
return $this->subsets[$vertex]->parent;
}
/**
* Performs the union of two sets (or trees). Depending on the rank,
* one set is joined to the other: the set with the lower rank is
* appended to the one with the higher rank. If both have the same
* rank, it doesn't matter which tree is joined to the other, but
* the rank increases.
*
* @param int $vertex1 the index of the first set.
* @param int $vertex2 the index of the second set.
*/
public function union($vertex1, $vertex2) {
if($this->subsets[$vertex2]->parent < $this->subsets[$vertex1]->parent) {
$this->subsets[$vertex1]->parent = $vertex2;
} else {
if($this->subsets[$vertex1]->parent === $this->subsets[$vertex2]->parent) {
if($this->subsets[$vertex1]->parent === null) {
$this->subsets[$vertex1]->parent = -1;
} else {
$this->subsets[$vertex1]->parent--;
}
}
$this->subsets[$vertex2]->parent = $vertex1;
}
}
}
|
CHMS=/home/chms
for i in "$CHMS"/*.chm; do
file="$i"
# Strip the .chm suffix to get the output directory name.
dir="${file%.chm}"
extract_chmLib "$file" "$dir"
done
|
import React, { FC, useEffect, useRef, useCallback } from 'react';
import cx from 'classnames';
import { PDFPageProxy, PDFPageViewport, TextContent, TextContentItem } from 'pdfjs-dist';
import { EventBus } from 'pdfjs-dist/lib/web/ui_utils';
import { TextLayerBuilder } from 'pdfjs-dist/lib/web/text_layer_builder';
import useAsyncFunctionCall from 'utils/useAsyncFunctionCall';
import { PdfDisplayProps } from './types';
type Props = Pick<PdfDisplayProps, 'scale'> & {
className?: string;
/**
* PDF page from pdfjs
*/
loadedPage: PDFPageProxy | null | undefined;
/**
* Callback for text layer info
*/
setRenderedText?: (info: PdfRenderedText | null) => any;
};
export type PdfRenderedText = {
/**
* PDF text content
*/
textContent: TextContent & {
styles: { [styleName: string]: CSSStyleDeclaration };
};
/**
* Text span DOM elements rendered on the text layer
*/
textDivs: HTMLElement[];
/**
* Pdf page viewport used to render text items
*/
viewport: PDFPageViewport;
/**
* Page number, starting at 1
*/
page: number;
};
const PdfViewerTextLayer: FC<Props> = ({
className,
loadedPage,
scale = 1,
setRenderedText = () => {}
}) => {
const textLayerRef = useRef<HTMLDivElement>(null);
const textLayerDiv = textLayerRef.current;
// load text content from the page
const loadedText = useAsyncFunctionCall(
useCallback(async () => {
if (loadedPage) {
const viewport = loadedPage.getViewport({ scale });
const textContent = await loadedPage.getTextContent();
return { textContent, viewport, page: loadedPage.pageNumber, scale };
}
return null;
}, [loadedPage, scale])
);
// render text content
const renderedText = useAsyncFunctionCall(
useCallback(
async (signal: AbortSignal) => {
if (textLayerDiv && loadedText) {
const { textContent, viewport, scale, page } = loadedText;
const builder = new TextLayerBuilder({
textLayerDiv,
viewport,
eventBus: new EventBus(),
pageIndex: page - 1
});
signal.addEventListener('abort', () => builder.cancel());
await _renderTextLayer(builder, textContent, textLayerDiv, scale);
return { textContent, viewport, page, textDivs: builder.textDivs };
}
return undefined;
},
[loadedText, textLayerDiv]
)
);
useEffect(() => {
if (renderedText !== undefined) {
setRenderedText(renderedText);
}
}, [renderedText, setRenderedText]);
const rootClassName = cx(className, `textLayer`);
return (
<div
className={rootClassName}
ref={textLayerRef}
style={{
width: `${loadedText?.viewport?.width ?? 0}px`,
height: `${loadedText?.viewport?.height ?? 0}px`
}}
/>
);
};
/**
* Render text into DOM using the text layer builder
*/
async function _renderTextLayer(
builder: TextLayerBuilder,
textContent: TextContent,
textLayerDiv: HTMLDivElement,
scale: number
) {
builder.setTextContent(textContent);
// render
textLayerDiv.innerHTML = '';
const deferredRenderEndPromise = new Promise(resolve => {
const listener = () => {
resolve();
builder?.eventBus.off('textlayerrendered', listener);
};
builder?.eventBus.on('textlayerrendered', listener);
});
builder.render();
await deferredRenderEndPromise;
_adjustTextDivs(builder.textDivs, textContent.items, scale);
}
/**
* Adjust text span width based on scale
* @param textDivs
* @param textItems
* @param scale
*/
function _adjustTextDivs(
textDivs: HTMLElement[],
textItems: TextContentItem[] | null,
scale: number
): void {
const scaleXPattern = /scaleX\(([\d.]+)\)/;
(textDivs || []).forEach((textDivElm, index) => {
const textItem = textItems?.[index];
if (!textItem) return;
const expectedWidth = textItem.width * scale;
const actualWidth = textDivElm.getBoundingClientRect().width;
function getScaleX(element: HTMLElement) {
const match = element.style.transform?.match(scaleXPattern);
if (match) {
return parseFloat(match[1]);
}
return null;
}
const currentScaleX = getScaleX(textDivElm);
if (currentScaleX && !isNaN(currentScaleX)) {
const newScale = `scaleX(${(expectedWidth / actualWidth) * currentScaleX})`;
textDivElm.style.transform = textDivElm.style.transform.replace(scaleXPattern, newScale);
} else {
const newScale = `scaleX(${expectedWidth / actualWidth})`;
textDivElm.style.transform = newScale;
}
});
}
export default PdfViewerTextLayer;
|
// https://github.com/vuejs/vuejs.org/blob/master/src/v2/examples/vue-20-single-file-components/Hello.vue
module.exports = {
data() {
return {
greeting: 'Hello',
}
},
}
|
Create Database M_InLock
Use M_InLock
Create Table Usuarios
(
UsuarioId Int Primary Key Identity
,Email Varchar (255) Not Null
,Senha Varchar (255) Not Null
,PermissaoDoUsuario Varchar (255) Not Null
);
Create Table Estudios
(
EstudioId Int Primary Key Identity
,NomeEstudio Varchar (255) Not Null
,PaisDeOrigem Varchar (255) Not Null
,DataCriacao Date Not Null
,UsuarioId Int Foreign Key References Usuarios (UsuarioId)
);
Create Table Jogos
(
JogoId Int Primary Key Identity
,NomeJogo Varchar (255) Not Null
,Descricao Text Not Null
,DataLancamento Date Not Null
,Valor Smallmoney Not Null
,EstudioId Int Foreign Key References Estudios (EstudioId)
);
|
import {CreateEventStatement, EventStatus, SyntaxKind} from "../../../parser-node";
import {emitAccountIdentifierOrCurrentUser, emitEventIdentifier} from "../../identifier";
import {StringBuilder} from "../../string-builder";
import {emitStoredProcedureStatement} from "../../stored-procedure-statement";
import {emitStringLiteral} from "../../expression";
import {emitExecuteAtSchedule} from "./execute-at-schedule";
import {emitIntervalSchedule} from "./interval-schedule";
function emitCreateEventStatementStart (statement : CreateEventStatement) : StringBuilder {
const startA = new StringBuilder()
.append("CREATE")
.scope(builder => {
builder
.append(" DEFINER = ")
.appendBuilder(emitAccountIdentifierOrCurrentUser(statement.definer))
})
.append(" EVENT ")
.append(
statement.ifNotExists ?
"IF NOT EXISTS " :
undefined
)
.appendBuilder(emitEventIdentifier(statement.eventIdentifier))
if (!startA.shouldMultiLine()) {
return startA;
}
return new StringBuilder()
.append("CREATE")
.scope(builder => {
builder
.append(" DEFINER = ")
.appendBuilder(emitAccountIdentifierOrCurrentUser(statement.definer))
})
.indent(builder => {
builder
.append("EVENT ")
.append(
statement.ifNotExists ?
"IF NOT EXISTS " :
undefined
)
.appendBuilder(emitEventIdentifier(statement.eventIdentifier))
})
}
export function emitCreateEventStatement (statement : CreateEventStatement) {
let start = emitCreateEventStatementStart(statement);
return start
.indent(builder => {
builder
.append("ON SCHEDULE ")
.appendBuilder(
statement.schedule.syntaxKind == SyntaxKind.ExecuteAtSchedule ?
emitExecuteAtSchedule(statement.schedule) :
emitIntervalSchedule(statement.schedule)
)
})
.indent(builder => {
builder
.append(
statement.onCompletionPreserve ?
"ON COMPLETION PRESERVE" :
"ON COMPLETION NOT PRESERVE"
)
})
.indent(builder => {
builder
.append(
statement.eventStatus == EventStatus.ENABLE ?
"ENABLE" :
statement.eventStatus == EventStatus.DISABLE ?
"DISABLE" :
"DISABLE ON SLAVE"
)
})
.scope(builder => {
if (statement.comment == undefined) {
return;
}
const comment = emitStringLiteral(statement.comment);
builder.indent(builder => {
builder
.append("COMMENT ")
.appendBuilder(comment)
})
})
.indent(builder => {
builder
.append("DO")
.indent(builder => {
builder
.appendBuilder(emitStoredProcedureStatement(statement.statement))
})
})
}
|
<?php
namespace Victoire\Bundle\ConfigBundle\Validator;
use Symfony\Component\Validator\Constraint;
use Symfony\Component\Validator\ConstraintValidator;
class JsonValidator extends ConstraintValidator
{
/**
* @param mixed $value
* @param Constraint $constraint
*
* @throws \Exception
*/
public function validate($value, Constraint $constraint)
{
try {
if (null !== $value) {
$array = json_decode($value, true);
if (!\is_array($array)) {
throw new \Exception();
}
return $array;
}
} catch (\Exception $e) {
$this->context->buildViolation('victoire.config.json.invalid')
->setParameter('{{ value }}', $value)
->addViolation();
}
}
}
|
import { np } from './np';
import { withCache } from './withCache';
export default {
np,
withCache,
};
|
import { Admin } from '../schemas/AdminSchema'
//admin details statically inserted
export const storeAdminDetail = () => {
const adminData = {
admin_name: "Admin",
email: "[email protected]",
password: "12345678",
token: 'dfd8ufejh8bh8idsfsdiduf',
profile_picture: "admin.png"
}
/* Running the index file would otherwise insert a new admin record every time,
so we only create the admin if one does not already exist. */
Admin.findOne({ email: '[email protected]' }).exec((error, result) => {
if (!result) {
const admin = new Admin(adminData);
admin.save();
}
})
}
|
module ActiveAdmin
module Inputs
class SwitchInput < BooleanInput
include ::Adminterface::Extensions::Inputs::Base
def boolean_wrapper_class
"#{super} form-switch".squish
end
end
end
end
|
/**
*
* © Copyright 2017 Greg Symons <[email protected]>.
*
* Licensed under the Apache License, Version 2.0 (the "License"); you may not
* use this file except in compliance with the License. You may obtain a copy of
* the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
* WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
* License for the specific language governing permissions and limitations under
* the License.
*
*/
package biz.gsconsulting.add_minutes.scaladsl
import scala.util.parsing.combinator._
/**
* Some constants that I seem to be using everywhere.
*/
trait TimeConstants {
val HOURS_PER_DAY = 24
val MERIDIAN = HOURS_PER_DAY / 2
val MINUTES_PER_HOUR = 60
val MINUTES_PER_DAY = HOURS_PER_DAY * MINUTES_PER_HOUR
}
/**
* A Meridian is a signifier for which half of the day we're in on a 12-hour
* clock: AM or PM
*/
sealed trait Meridian extends TimeConstants
final case object AM extends Meridian {
override def toString: String = "AM"
}
final case object PM extends Meridian {
override def toString: String = "PM"
}
final object Meridian extends Meridian {
/**
* Return the Meridian for a given string
*/
def of(meridian: String) = {
meridian match {
case "AM" => AM
case "PM" => PM
case invalid: String => scala.sys.error(s"Invalid meridian: [$invalid]")
}
}
/**
* Return the Meridian for a given Instant
*/
def of(instant: Instant) = {
if (instant.hours >= MERIDIAN) PM
else AM
}
}
final case class Instant(minutes: Long) extends TimeConstants {
/**
* The hours portion of the instant on a 24-hour clock (i.e. 0-23)
*/
lazy val hours = (minutes / MINUTES_PER_HOUR) % HOURS_PER_DAY
/**
* The hour portion of the instant on a 12-hour clock (i.e. 1-12)
*/
lazy val hour = {
if (hours == 0 || hours == MERIDIAN) MERIDIAN
else hours % MERIDIAN
}
/**
* The minute portion of the instant
*/
lazy val minute = minutes % MINUTES_PER_HOUR
/**
* The meridian of the instant (i.e. AM or PM)
*/
lazy val meridian = Meridian.of(this)
/**
* Add m minutes to the instant and return a new instant
*/
def addMinutes(m: Long): Instant = {
if (m < 0) {
val lessMinutes = (m % MINUTES_PER_HOUR)
val lessHoursInMinutes = ((m / MINUTES_PER_HOUR) % HOURS_PER_DAY) * MINUTES_PER_HOUR
val totalLessMinutes = lessMinutes + lessHoursInMinutes
// Check for overflow, but the max overflow is 23 hours, 59 minutes because of
// how we calculated totalLessMinutes above, so if we add a day, the modular
// arithmetic on the accessors will give us the correct results. I suppose
// in theory, we could make sure that the final minutes value is in the correct
// range for a single day, but since these Instants don't _really_ have a base
// epoch (or a time zone), it doesn't actually make a difference. Also, we'd have
// to account for it on the other side of the branch, too
if (totalLessMinutes.abs <= minutes) Instant(minutes + totalLessMinutes)
else Instant(minutes + totalLessMinutes + MINUTES_PER_DAY)
}
//Positive modular math is easy...
else Instant(minutes + m)
}
/**
* Format the instant as "[H]H:mm AM|PM"
*/
override def toString = f"${hour}%d:${minute}%02d ${meridian}"
}
/**
* A proper parser for parsing a time in the form "[H]H:mm AM|PM"
*
* Sure, I could've just used a single regex to parse this format, but
* using a parser combinator lets me write nicer error messages that
* show you what specifically was wrong.
*/
final object ParseTime extends RegexParsers
with TimeConstants
{
def hour : Parser[Long] = """(1[0-2]|[1-9])""".r ^^ { _.toLong }
def minutes : Parser[Long] = """[0-5]\d""".r ^^ { _.toLong }
// I suppose I could be more permissive by making the regex case-insensitive, but
// hey, it wasn't in the spec.
def meridian: Parser[Meridian] = """AM|PM""".r ^^ { (m: String) => Meridian.of(m) }
def sep : Parser[String] = ":"
def time : Parser[Instant] = hour ~ sep ~ minutes ~ whiteSpace ~ meridian ^^ {
case h ~ _ ~ m ~ _ ~ AM if h == MERIDIAN => Instant(m)
case h ~ _ ~ m ~ _ ~ PM if h == MERIDIAN => Instant(h * MINUTES_PER_HOUR + m)
case h ~ _ ~ m ~ _ ~ AM => Instant(h * MINUTES_PER_HOUR + m)
case h ~ _ ~ m ~ _ ~ PM => Instant((h + MERIDIAN) * MINUTES_PER_HOUR + m )
}
final case class TimeParseFailure(msg: String, cause: Throwable = null) extends RuntimeException(msg, cause)
override val skipWhitespace = false
def parseException[T](msg: String, input: Input): T = {
throw TimeParseFailure(s"""
| Unable to parse time: [${input.source}]!
|
| $msg:
|
| ${input.pos.longString}
""".stripMargin)
}
def apply(input: String): Instant = {
if (input.isEmpty) throw TimeParseFailure("Input string was empty!")
parseAll(time, input) match {
case Success(result, _) => result
case NoSuccess(msg, next) => parseException(msg, next)
}
}
}
final class TemporalAdder {
def addMinutes(time: String, minutes: Int): String = {
ParseTime(time).addMinutes(minutes).toString
}
}
|
/* The smooth Class Library
* Copyright (C) 1998-2018 Robert Kausch <[email protected]>
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of "The Artistic License, Version 2.0".
*
* THIS PACKAGE IS PROVIDED "AS IS" AND WITHOUT ANY EXPRESS OR
* IMPLIED WARRANTIES, INCLUDING, WITHOUT LIMITATION, THE IMPLIED
* WARRANTIES OF MERCHANTIBILITY AND FITNESS FOR A PARTICULAR PURPOSE. */
#include <smooth/gui/window/window.h>
#include <smooth/gui/widgets/layer.h>
#include <smooth/definitions.h>
#include <smooth/gui/widgets/basic/tabwidget.h>
#include <smooth/graphics/surface.h>
const S::Short S::GUI::Layer::classID = S::Object::RequestClassID();
S::GUI::Layer::Layer(const String &name) : Widget(Point(0, 0), Size(32768, 32768))
{
type = classID;
text = name;
orientation = OR_CENTER;
ComputeTextSize();
}
S::GUI::Layer::~Layer()
{
}
S::Int S::GUI::Layer::Paint(Int message)
{
if (!IsRegistered()) return Error();
if (!IsVisible()) return Success();
switch (message)
{
case SP_PAINT:
if (IsBackgroundColorSet())
{
Surface *surface = GetDrawSurface();
surface->Box(Rect(GetRealPosition(), GetRealSize()), GetBackgroundColor(), Rect::Filled);
}
break;
}
return Widget::Paint(message);
}
S::Int S::GUI::Layer::SetMetrics(const Point &iPos, const Size &iSize)
{
if (orientation == OR_CENTER) orientation = OR_FREE;
return Widget::SetMetrics(iPos, iSize);
}
|
---
title: 'Downloading annotations'
linkTitle: 'Downloading annotations'
weight: 18
---
1. To download the latest annotations, you have to save all changes first.
Click the `Save` button. There is a `Ctrl+S` shortcut to save annotations quickly.
1. After that, click the `Menu` button.
1. Press the `Export task dataset` button.

1. Choose the format for exporting the dataset. Exporting is available in several formats:
- [CVAT for video](/docs/manual/advanced/xml_format/#interpolation)
choose if the task is created in interpolation mode.
- [CVAT for images](/docs/manual/advanced/xml_format/#annotation)
choose if a task is created in annotation mode.

- [PASCAL VOC](http://host.robots.ox.ac.uk/pascal/VOC/)
- [(VOC) Segmentation mask](http://host.robots.ox.ac.uk/pascal/VOC/) —
archive contains class and instance masks for each frame in the png
format and a text file with the value of each color.
- [YOLO](https://pjreddie.com/darknet/yolo/)
- [COCO](http://cocodataset.org/#format-data)
- [TFRecord](https://www.tensorflow.org/tutorials/load_data/tfrecord)
- [MOT](https://motchallenge.net/)
- [LabelMe 3.0](http://labelme.csail.mit.edu/Release3.0/)
- [Datumaro](https://github.com/openvinotoolkit/cvat/tree/develop/cvat/apps/dataset_manager/formats/datumaro)
- [ImageNet](http://www.image-net.org/)
- [CamVid](http://mi.eng.cam.ac.uk/research/projects/VideoRec/CamVid/)
- [WIDER Face](http://shuoyang1213.me/WIDERFACE/)
- [VGGFace2](https://github.com/ox-vgg/vgg_face2)
- [Market-1501](https://www.aitribune.com/dataset/2018051063)
- [ICDAR13/15](https://rrc.cvc.uab.es/?ch=2)
For 3D tasks, the following formats are available:
- [Kitti Raw Format 1.0](http://www.cvlibs.net/datasets/kitti/raw_data.php)
- Sly Point Cloud Format 1.0 - Supervisely Point Cloud dataset
1. To download images along with the dataset, tick the `Save images` box.
1. (Optional) To name the resulting archive, use the `Custom name` field.

|
// FULL_JDK
package test
import java.util.AbstractList
public open class ModalityOfFakeOverrides : AbstractList<String>() {
override fun get(index: Int): String {
return ""
}
override val size: Int get() = 0
}
|
### 1.2.3
* Include System.Net.Requests & System.Net.WebClient in references
### 1.2.2
* Enable passing otherFSharpOptions
### 1.2.0
* fable-compiler 2.4.2
### 1.1.8
* fable-compiler 2.3.25
### 1.1.7
* fable-compiler 2.3.24
### 1.1.6
* fable-compiler 2.3.21
### 1.1.5
* fable-compiler 2.3.19
### 1.1.4
* fable-compiler 2.3.12
### 1.1.3
* Don't crash REPL compilation on Babel errors
### 1.1.2
* fable-compiler 2.3.10
### 1.1.1
* Update bundle and Worker dependencies
### 1.0.4
* Fixed optimized patterns
### 1.0.3
* fcs-fable sync
### 1.0.1
* Decimal fixes
### 1.0.0
* Publish stable
### 1.0.0-beta-009
* Anonymous records
### 1.0.0-beta-004
* Compile tests with fable-compiler-js @ncave
### 1.0.0-beta-003
* Fix web worker
### 1.0.0-beta-002
* Update sources and add worker
### 1.0.0-beta-001
* First publish
|
# chemical-symbols
Symbols of the chemical elements.
## Example
``` javascript
var symbols = require('chemical-symbols');
console.log(symbols);
// =>[
// => 'H',
// => 'He',
// => 'Li',
// => ...
// => 'Lv',
// => 'Ts',
// => 'Og'
// =>]
```
## Installation
``` bash
$ npm install chemical-symbols
```
## API
``` javascript
var symbols = require('chemical-symbols');
```
### `symbols`
An _Array_ of chemical symbols (each a _String_). Adding `1` to the index of any
symbol will get you the element's atomic number!
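To illustrate the index-to-atomic-number relationship, here is a small sketch. The `atomicNumber` helper is hypothetical (not part of the module), and for self-containment only the first few symbols are listed inline rather than loaded via `require`:

``` javascript
// First eight symbols, in atomic-number order (the real module exports the full list).
var symbols = ['H', 'He', 'Li', 'Be', 'B', 'C', 'N', 'O'];

// Hypothetical helper: look up a symbol's atomic number via its array index.
function atomicNumber(symbol) {
  var index = symbols.indexOf(symbol);
  return index === -1 ? null : index + 1;
}

atomicNumber('He'); // => 2
```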
|
// Copyright 2020 The Chromium Authors. All rights reserved.
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.
#ifndef ASH_CAPTURE_MODE_CAPTURE_MODE_TYPES_H_
#define ASH_CAPTURE_MODE_CAPTURE_MODE_TYPES_H_
namespace ash {
// Defines the capture type Capture Mode is currently using.
enum class CaptureModeType {
kImage,
kVideo,
};
// Defines the source of the capture used by Capture Mode.
enum class CaptureModeSource {
kFullscreen,
kRegion,
kWindow,
};
// Specifies the capture mode allowance types.
enum class CaptureAllowance {
// Capture mode is allowed.
kAllowed,
// Capture mode is blocked due to admin-enforced Data Leak Prevention policy.
kDisallowedByDlp,
// Capture mode is blocked due to admin-enforced device policy.
kDisallowedByPolicy,
// Video recording is blocked due to app- or content-enforced content
// protection. Applies only to video recording.
kDisallowedByHdcp
};
// The position of the press event during the fine tune phase of a region
// capture session. This will determine what subsequent drag events do to the
// select region.
enum class FineTunePosition {
// The initial press was outside region. Subsequent drags will do nothing.
kNone,
// The initial press was inside the select region. Subsequent drags will
// move the entire region.
kCenter,
// The initial press was on one of the drag affordance circles. Subsequent
// drags will resize the region. These are sorted clockwise starting at the
// top left.
kTopLeft,
kTopCenter,
kTopRight,
kRightCenter,
kBottomRight,
kBottomCenter,
kBottomLeft,
kLeftCenter,
};
} // namespace ash
#endif // ASH_CAPTURE_MODE_CAPTURE_MODE_TYPES_H_
|
<?php
declare(strict_types=1);
namespace Tests\Stub\PHPUnitJSONLogOutput;
use Paraunit\Parser\JSON\Log;
class JSONLogStub
{
public const TWO_ERRORS_TWO_FAILURES = '2Errors2Failures';
public const ALL_GREEN = 'AllGreen';
public const FATAL_ERROR = 'FatalError';
public const SEGFAULT = 'SegFault';
public const ONE_ERROR = 'SingleError';
public const ONE_INCOMPLETE = 'SingleIncomplete';
public const ONE_RISKY = 'SingleRisky';
public const ONE_SKIP = 'SingleSkip';
public const ONE_WARNING = 'SingleWarning';
public const UNKNOWN = 'Unknown';
public const PARSE_ERROR = 'ParseError';
/**
* @return mixed The decoded JSON log content
*
* @throws \Exception
*/
public static function getLogs(string $filename)
{
return json_decode(self::getCleanOutputFileContent($filename));
}
/**
* @throws \Exception
*/
public static function getCleanOutputFileContent(string $filename): string
{
$fullFilename = __DIR__ . DIRECTORY_SEPARATOR . $filename . '.json';
if (! file_exists($fullFilename)) {
throw new \Exception('Unknown file stub: ' . $filename);
}
/** @var string $rawLog */
$rawLog = file_get_contents($fullFilename);
return self::cleanLog($rawLog);
}
/**
* @return string The normalized log, as a JSON array string
*/
private static function cleanLog(string $jsonString): string
{
$splitted = preg_replace('/\}\{/', '},{', $jsonString);
return '[' . $splitted . ']';
}
}
|
<?php
namespace App\Containers\Message\Actions;
use Apiato\Core\Foundation\Facades\Apiato;
use App\Ship\Parents\Actions\Action;
class GetAllMessagesAction extends Action
{
public function run()
{
$messages = Apiato::call('Message@GetAllMessagesTask', [true], [
'ordered',
'groups'
]);
return $messages;
}
}
|
---
bio: My research interests include coronavirus (COVID-19) data analysis using R programming.
education:
courses:
- course: Master of Business Analytics
institution: Monash University
year: 2021
- course: Bachelor of Accounting
institution: La Trobe University
year: 2019
email: "[email protected]"
interests:
- Statistics
- Data visualization
- Shinyapp
organizations:
- name: Monash University
role: Student of Business Analytics
social:
- icon: github
icon_pack: fab
link: https://github.com/Phyllis-Lin
- icon: linkedin
icon_pack: fab
link: https://www.linkedin.com/in/phyllis-lin-b7a0a7179
- icon: facebook
icon_pack: fab
link: https://www.facebook.com/Phyllis.Lpm/
- icon: twitter
icon_pack: fab
link: https://twitter.com/phyllis_lpm
- icon: weibo
icon_pack: fab
link: https://weibo.com/2239780372/profile?rightmod=1&wvr=6&mod=personinfo&is_all=1
superuser: true
title: Peimin Lin
user_groups:
- Principal Investigators
---
Hi! I am Peimin, and welcome to my blog! I will keep it updated, so please keep following me!
|
package dk.cachet.carp.common.application.data.input
import dk.cachet.carp.common.application.data.input.elements.SelectOne
/**
* All default CARP [InputDataType]s.
*/
object CarpInputDataTypes : InputDataTypeList()
{
/**
* The [InputDataType] namespace of all CARP input data type definitions.
*/
const val CARP_NAMESPACE: String = "dk.cachet.carp.input"
internal const val SEX_TYPE_NAME = "$CARP_NAMESPACE.sex"
/**
* Biological sex assigned at birth.
*/
val SEX = add(
inputDataType = InputDataType.fromString( SEX_TYPE_NAME ),
inputElement = SelectOne( "Sex", Sex.values().map { it.toString() }.toSet() ),
dataClass = Sex::class,
inputToData = { Sex.valueOf( it ) },
dataToInput = { it.name }
)
}
|
require "rails_helper"
describe WeeklyIterationSuggestions do
describe "#perform" do
it "calls WeeklyIterationRecommender for each user" do
user = create(:user)
video = create(:recommendable_content).recommendable
recommender = stub_weekly_iteration_recommender_with(
user: user,
sorted_recommendable_videos: [video],
)
WeeklyIterationSuggestions.new([user]).send
expect(recommender).to have_received(:recommend).once
end
end
def stub_weekly_iteration_recommender_with(args)
double("recommender", recommend: true).tap do |recommender|
allow(WeeklyIterationRecommender).
to receive(:new).
with(args).
and_return(recommender)
end
end
end
|
CREATE SCHEMA Test_AssertEquals;
GO
CREATE PROCEDURE Test_AssertEquals.Test_Hello_Hello
AS
BEGIN
EXEC tSQLt.AssertEquals 'hello', 'hello';
END;
GO
CREATE PROCEDURE Test_AssertEquals.Test_NULL_NULL
AS
BEGIN
EXEC tSQLt.AssertEquals NULL, NULL;
END;
GO
CREATE PROCEDURE Test_AssertEquals.Test_Hello_Hallo
AS
BEGIN
EXEC tSQLt.ExpectException 'tSQLt.AssertEquals failed. Expected:<hello>. Actual:<hallo>.';
EXEC tSQLt.AssertEquals 'hello', 'hallo';
END;
GO
CREATE PROCEDURE Test_AssertEquals.Test_ErrorMessage
AS
BEGIN
EXEC tSQLt.ExpectException 'Error message. tSQLt.AssertEquals failed. Expected:<hello>. Actual:<hallo>.';
EXEC tSQLt.AssertEquals 'hello', 'hallo', 'Error message.';
END;
GO
CREATE PROCEDURE Test_AssertEquals.Test_Hello_NULL
AS
BEGIN
EXEC tSQLt.ExpectException 'tSQLt.AssertEquals failed. Expected:<hello>. Actual:<(null)>.';
EXEC tSQLt.AssertEquals 'hello', NULL;
END;
GO
CREATE PROCEDURE Test_AssertEquals.Test_5_5
AS
BEGIN
EXEC tSQLt.AssertEquals 5, 5;
END;
GO
CREATE PROCEDURE Test_AssertEquals.Test_Pi_Pi
AS
BEGIN
EXEC tSQLt.AssertEquals 3.14, 3.14;
END;
GO
CREATE PROCEDURE Test_AssertEquals.Test_5_6
AS
BEGIN
EXEC tSQLt.ExpectException 'tSQLt.AssertEquals failed. Expected:<5>. Actual:<6>.';
EXEC tSQLt.AssertEquals 5, 6;
END;
GO
CREATE PROCEDURE Test_AssertEquals.Test_Pi_Pi1
AS
BEGIN
EXEC tSQLt.ExpectException 'tSQLt.AssertEquals failed. Expected:<3.14>. Actual:<3.141>.';
EXEC tSQLt.AssertEquals 3.14, 3.141;
END;
GO
|
#include "Halide.h"
using namespace Halide;
int main(int argc, char **argv) {
Target target = get_jit_target_from_environment();
Target target_fuzzed = target.with_feature(Target::FuzzFloatStores);
const int size = 1000;
{
// Check some code that should be unaffected
Func f;
Var x;
f(x) = (x - 42.5f) / 16.0f;
f.vectorize(x, 8);
// Pipelines that only use a few significant bits of the float should be unaffected
Buffer<float> im_ref = f.realize(size, target);
Buffer<float> im_fuzzed = f.realize(size, target_fuzzed);
for (int i = 0; i < im_ref.width(); i++) {
// Test for exact floating point equality, which is exactly
// the sort of thing FuzzFloatStores is trying to discourage.
if (im_ref(i) != im_fuzzed(i)) {
printf("Expected exact floating point equality between %10.10g and %10.10g\n", im_ref(i), im_fuzzed(i));
return -1;
}
}
}
{
// Check some code that should be affected
Func f;
Var x;
f(x) = sqrt(x - 42.3333333f) / 17.0f - tan(x);
f.vectorize(x, 8);
Buffer<float> im_ref = f.realize(size, target);
Buffer<float> im_fuzzed = f.realize(size, target_fuzzed);
int differences = 0;
for (int i = 0; i < im_ref.width(); i++) {
// Pipelines that use all the bits should be wrong about half the time
if (im_ref(i) != im_fuzzed(i)) {
differences++;
}
}
if (differences == 0) {
printf("fuzzing float stores should have done something\n");
return -1;
}
if (differences == size) {
printf("fuzzing float stores should not have changed every store\n");
return -1;
}
}
printf("Success!\n");
return 0;
}
|
# task-progress
Shows native OS task progress for node apps (wherever supported).
Could be used for downloaders, load indicators or any kind of metrics.
<img src="http://i.imgur.com/4BpNBHb.png" border="0" />
## Installing
```sh
$ npm install task-progress
```
## Using
Just one function, `progress(n)`, where `n` is a number in the range **0-100**.
The progress indicator removes itself when passed anything outside that range.
```js
var progress = require('task-progress')
process.title = 'My app'
progress(15) // changes title to: `{15%} My app`
progress(100) // resets title back to: `My app`
```
## License
MIT
|
package models.im
import java.util.UUID
import io.suggest.common.empty.OptionUtil
import io.suggest.common.html.HtmlConstants
import io.suggest.compress.MCompressAlgo
import io.suggest.img.crop.MCrop
import io.suggest.img.MImgFormat
import io.suggest.jd.MJdEdgeId
import io.suggest.n2.edge.MEdge
import io.suggest.util.UuidUtil
import monocle.macros.GenLens
/**
* Suggest.io
* User: Konstantin Nikiforov <[email protected]>
* Created: 09.02.18 16:26
* Description: Model of a unique file name/path/identifier (not necessarily local), in particular for images.
* The file may be derived from some original, so the identifier
* includes fields for the format, IM transformations, etc.
* Also used as node ids.
*
* Until 2020.07.02 this was the image-identifier model.
*/
object MDynImgId {
/** Random id for a new file original. */
def randomId(): String =
UuidUtil.uuidToBase64( UUID.randomUUID() )
/**
* Build ids for model instances that store dynamic images.
*
* 2018-02-09 With the introduction of image formats, the file-format extension follows rowKeyStr:
*
* Examples:
* "afw43faw4ffw" // Original file in the original format.
* "afw43faw4ffw.jpeg?a=x&b=e" // JPEG derivative of the original "afw43faw4ffw".
*/
def mkMediaId(dynImgId: MDynImgId): String = {
var acc: List[String] = Nil
// String with modifiers.
val qOpt = dynImgId.qOpt
for (q <- qOpt)
acc = "?" :: q :: acc
// Image format extension, if not the original.
for (imgFormat <- dynImgId.imgFormat if dynImgId.hasImgOps)
acc = HtmlConstants.`.` :: imgFormat.fileExt :: acc
// Final assembly of the full id.
if (acc.isEmpty)
dynImgId.origNodeId
else
(dynImgId.origNodeId :: acc).mkString
}
/** Extractor of [[MDynImgId]] data from a jd-edge combined with a regular edge.
* The method does not validate the consistency between the two.
*
* @param jdId The jd-edge data.
* @param medge The associated MEdge.
* @return
*/
def fromJdEdge(jdId: MJdEdgeId, medge: MEdge): MDynImgId = {
apply(
origNodeId = medge.nodeIds.head,
imgFormat = jdId.outImgFormat,
imgOps = {
var acc = List.empty[ImOp]
for (mcrop <- jdId.crop)
acc ::= AbsCropOp( mcrop )
acc
}
)
}
def rowKeyStr = GenLens[MDynImgId](_.origNodeId)
def imgFormat = GenLens[MDynImgId](_.imgFormat)
def dynImgOps = GenLens[MDynImgId](_.imgOps)
def compressAlgo = GenLens[MDynImgId](_.compressAlgo)
implicit class DynImgIdOpsExt( private val dynImgId: MDynImgId) extends AnyVal {
/** Probe for a crop. Used mostly as compat with the previous way of working with images. */
def cropOpt: Option[MCrop] = {
val iter = dynImgId
.imgOps
.iterator
.flatMap {
case AbsCropOp(crop) => crop :: Nil
case _ => Nil
}
OptionUtil.maybe(iter.hasNext)( iter.next() )
}
def isCropped: Boolean = {
dynImgId
.imgOps
.exists { _.isInstanceOf[ImCropOpT] }
}
def hasImgOps: Boolean = dynImgId.imgOps.nonEmpty
def isOriginal = !hasImgOps
def original: MDynImgId = {
if (hasImgOps) dynImgId._originalHolder
else dynImgId
}
def maybeOriginal: Option[MDynImgId] = {
OptionUtil.maybe(hasImgOps)( dynImgId._originalHolder )
}
def imgIdAndOrig: Seq[MDynImgId] = {
val ll0 = LazyList.empty[MDynImgId]
dynImgId #:: {
maybeOriginal.fold(ll0) { d =>
d #:: ll0
}
}
}
def mediaIdAndOrigMediaId: Seq[String] =
imgIdAndOrig.map(_.mediaId)
/** Append operations to the end of the operations list. */
def addDynImgOps(addDynImgOps: Seq[ImOp]): MDynImgId = {
if (addDynImgOps.isEmpty) {
dynImgId
} else {
val ops2 =
if (dynImgId.imgOps.isEmpty) addDynImgOps
else dynImgId.imgOps ++ addDynImgOps
(MDynImgId.dynImgOps replace ops2)(dynImgId)
}
}
/** Historically this was the column qualifier used in a column-oriented DBMS.
* Now it is likewise used as an id, or as part of an id.
*/
def qOpt: Option[String] = {
// TODO XXX dynFormat, compressAlgo
OptionUtil.maybe(hasImgOps)( dynImgId.dynImgOpsString )
}
}
}
/** Data container for identifying an image.
*
* @param origNodeId Id of the original image node. A regular node id without dynOps suffixes.
* @param imgFormat Dynamic image format.
* @param imgOps IM operations to apply to the original with key rowKey to obtain
* the required image.
* @param compressAlgo Optional final compression algorithm.
* Needed for building SVGs compressed with brotli or another algorithm.
*/
final case class MDynImgId(
origNodeId : String,
// 2020-07-03 The model is now used not only for images. dynFormat is optional.
imgFormat : Option[MImgFormat] = None,
imgOps : Seq[ImOp] = Nil,
compressAlgo : Option[MCompressAlgo] = None
// TODO svgo=()?
) {
// TODO Finish and enable the asserts for the image-format application rules.
// assert( dynFormat.nonEmpty )
lazy val fileName: String = {
val sb: StringBuilder = new StringBuilder(80)
sb.append( origNodeId )
if (imgOps.nonEmpty) {
sb.append('~')
.append( dynImgOpsString )
}
for (imgFmt <- imgFormat) {
sb.append( '.' )
.append( imgFmt.fileExt )
}
for (algo <- compressAlgo) {
sb.append( '.' )
.append( algo.fileExtension )
}
sb.toString()
}
/** Holder for the original's instance.
* To guard against keeping unneeded references to this, a method + lazy val pair is used here. */
private lazy val _originalHolder =
(MDynImgId.dynImgOps replace Nil)(this)
/** Id for the MMedia model. */
lazy val mediaId = MDynImgId.mkMediaId(this)
lazy val dynImgOpsString: String = {
// TODO XXX dynFormat, compressAlgo
ImOp.unbindImOps(
keyDotted = "",
value = imgOps,
withOrderInx = false,
)
}
lazy val fsFileName: String = {
if (this.hasImgOps) {
// TODO Should the format be appended here?
var r = dynImgOpsString
for (imgFmt <- imgFormat)
r = r + "." + imgFmt.fileExt
r
} else {
"__ORIG__"
}
}
override def toString = fileName
}
|
## Configuration
The specs are configured to run against `bolt://localhost:7687` by default. To change this in your local setup, set the `NEO4J_URL` environment variable on your system to suit your needs.
To make this easier, the neo4j spec suite allows you to add a local `.env` file for configuration. For example, to set `NEO4J_URL`, add a `.env` file that looks like this:
```
NEO4J_URL=bolt://localhost:6998
```
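As a sketch of how the suite might resolve the URL, an environment lookup with a fallback could look like the following. The `neo4j_url` helper name is hypothetical; only the default `bolt://localhost:7687` comes from the text above:

```ruby
# Hypothetical helper: resolve the Neo4j URL from the environment,
# falling back to the documented default when NEO4J_URL is unset.
def neo4j_url
  ENV.fetch('NEO4J_URL', 'bolt://localhost:7687')
end
```

With a `.env` loader in place, the variable would already be in `ENV` by the time the specs run, so the same lookup covers both configurations.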
|
//------------------------------------------------------------------------------
// <auto-generated>
// Changes to this file will be lost when the code is regenerated.
// The build server regenerates the code before each build and a pre-build
// step will regenerate the code on each local build.
// </auto-generated>
//------------------------------------------------------------------------------
namespace System.Units
{
public enum VolumeFlowUnit
{
Undefined = 0,
AcreFootPerDay,
AcreFootPerHour,
AcreFootPerMinute,
AcreFootPerSecond,
CentiliterPerDay,
CentiliterPerMinute,
CentiliterPerSecond,
CubicCentimeterPerMinute,
CubicDecimeterPerMinute,
CubicFootPerHour,
CubicFootPerMinute,
CubicFootPerSecond,
CubicMeterPerDay,
CubicMeterPerHour,
CubicMeterPerMinute,
CubicMeterPerSecond,
CubicMillimeterPerSecond,
CubicYardPerDay,
CubicYardPerHour,
CubicYardPerMinute,
CubicYardPerSecond,
DeciliterPerDay,
DeciliterPerMinute,
DeciliterPerSecond,
KiloliterPerDay,
KiloliterPerMinute,
KiloliterPerSecond,
KilousGallonPerMinute,
LiterPerDay,
LiterPerHour,
LiterPerMinute,
LiterPerSecond,
MegaliterPerDay,
MegaukGallonPerSecond,
MicroliterPerDay,
MicroliterPerMinute,
MicroliterPerSecond,
MilliliterPerDay,
MilliliterPerMinute,
MilliliterPerSecond,
MillionUsGallonsPerDay,
NanoliterPerDay,
NanoliterPerMinute,
NanoliterPerSecond,
OilBarrelPerDay,
OilBarrelPerHour,
OilBarrelPerMinute,
OilBarrelPerSecond,
UkGallonPerDay,
UkGallonPerHour,
UkGallonPerMinute,
UkGallonPerSecond,
UsGallonPerDay,
UsGallonPerHour,
UsGallonPerMinute,
UsGallonPerSecond,
}
}
|
module AWSPricing
class EMR
# Elastic MapReduce Service pricing data
# Returns Hash of pricing information
def self.pricing
Base.get('/elasticmapreduce/pricing/pricing-emr')
end
end
end
|
#!/bin/bash
set -e
script_dir=$(dirname "$0")
GIT_COMMITTER_NAME="auto-prettier" GIT_COMMITTER_EMAIL="[email protected]" \
git commit --author="auto-prettier <[email protected]>" \
--all --message="chore: automated prettier" --no-verify
sleep 1
git tag auto/post-prettier
git push origin auto/pre-prettier auto/post-prettier master
"$script_dir"/get-branches.sh | xargs -n 100 "$script_dir"/upgrade-branches.sh
exit 0
|
from collections import defaultdict
from dataclasses import dataclass, field, replace
from typing import Dict, List, Optional, Set, Tuple, Union
from .flow_graph import (
BasicNode,
ConditionalNode,
FlowGraph,
Node,
ReturnNode,
SwitchNode,
TerminalNode,
)
from .options import Options
from .translate import (
BinaryOp,
BlockInfo,
CommaConditionExpr,
Condition,
Expression,
Formatter,
FunctionInfo,
Statement as TrStatement,
SwitchControl,
format_expr,
get_block_info,
simplify_condition,
)
from .types import Type
@dataclass
class Context:
flow_graph: FlowGraph
fmt: Formatter
options: Options
is_void: bool = True
switch_nodes: Dict[SwitchNode, int] = field(default_factory=dict)
case_nodes: Dict[Node, List[Tuple[int, str]]] = field(
default_factory=lambda: defaultdict(list)
)
goto_nodes: Set[Node] = field(default_factory=set)
emitted_nodes: Set[Node] = field(default_factory=set)
has_warned: bool = False
@dataclass
class IfElseStatement:
condition: Condition
if_body: "Body"
else_body: Optional["Body"] = None
def should_write(self) -> bool:
return True
def format(self, fmt: Formatter) -> str:
space = fmt.indent("")
condition = simplify_condition(self.condition)
cond_str = format_expr(condition, fmt)
after_ifelse = f"\n{space}" if fmt.coding_style.newline_after_if else " "
before_else = f"\n{space}" if fmt.coding_style.newline_before_else else " "
with fmt.indented():
if_str = "\n".join(
[
f"{space}if ({cond_str}){after_ifelse}{{",
self.if_body.format(fmt), # has its own indentation
f"{space}}}",
]
)
if self.else_body is not None and not self.else_body.is_empty():
sub_if = self.else_body.get_lone_if_statement()
if sub_if:
sub_if_str = sub_if.format(fmt).lstrip()
else_str = f"{before_else}else {sub_if_str}"
else:
with fmt.indented():
else_str = "\n".join(
[
f"{before_else}else{after_ifelse}{{",
self.else_body.format(fmt),
f"{space}}}",
]
)
if_str = if_str + else_str
return if_str
@dataclass
class SwitchStatement:
jump: SwitchControl
body: "Body"
# If there are multiple switch statements in a single function, each is given a
# unique index starting at 1. This is used in comments to make control flow clear.
index: int
def should_write(self) -> bool:
return True
def format(self, fmt: Formatter) -> str:
lines = []
comments = []
body_is_empty = self.body.is_empty()
if self.index > 0:
comments.append(f"switch {self.index}")
if not self.jump.jump_table:
comments.append("unable to parse jump table")
elif body_is_empty:
comments.append(f"jump table: {self.jump.jump_table.symbol_name}")
suffix = ";" if body_is_empty else " {"
lines.append(
fmt.with_comments(
f"switch ({format_expr(self.jump.control_expr, fmt)}){suffix}", comments
)
)
if not body_is_empty:
with fmt.indented():
lines.append(self.body.format(fmt))
lines.append(fmt.indent("}"))
return "\n".join(lines)
@dataclass
class SimpleStatement:
contents: Optional[Union[str, TrStatement]]
comment: Optional[str] = None
is_jump: bool = False
def should_write(self) -> bool:
return self.contents is not None or self.comment is not None
def format(self, fmt: Formatter) -> str:
if self.contents is None:
content = ""
elif isinstance(self.contents, str):
content = self.contents
else:
content = self.contents.format(fmt)
if self.comment is not None:
comments = [self.comment]
else:
comments = []
return fmt.with_comments(content, comments)
def clear(self) -> None:
self.contents = None
self.comment = None
@dataclass
class LabelStatement:
context: Context
node: Node
def should_write(self) -> bool:
return (
self.node in self.context.goto_nodes or self.node in self.context.case_nodes
)
def format(self, fmt: Formatter) -> str:
lines = []
if self.node in self.context.case_nodes:
for (switch, case_label) in self.context.case_nodes[self.node]:
comments = [f"switch {switch}"] if switch != 0 else []
lines.append(fmt.with_comments(f"{case_label}:", comments, indent=-1))
if self.node in self.context.goto_nodes:
lines.append(f"{label_for_node(self.context, self.node)}:")
return "\n".join(lines)
@dataclass
class DoWhileLoop:
body: "Body"
condition: Condition
def should_write(self) -> bool:
return True
def format(self, fmt: Formatter) -> str:
space = fmt.indent("")
after_do = f"\n{space}" if fmt.coding_style.newline_after_if else " "
cond = format_expr(simplify_condition(self.condition), fmt)
with fmt.indented():
return "\n".join(
[
f"{space}do{after_do}{{",
self.body.format(fmt),
f"{space}}} while ({cond});",
]
)
Statement = Union[
SimpleStatement,
IfElseStatement,
LabelStatement,
SwitchStatement,
DoWhileLoop,
]
@dataclass
class Body:
print_node_comment: bool
statements: List[Statement] = field(default_factory=list)
def extend(self, other: "Body") -> None:
"""Add the contents of `other` into ourselves"""
self.print_node_comment |= other.print_node_comment
self.statements.extend(other.statements)
def add_node(self, node: Node, comment_empty: bool) -> None:
block_info = get_block_info(node)
statements = block_info.statements_to_write()
# Add node header comment
if self.print_node_comment and (statements or comment_empty):
self.add_comment(f"Node {node.name()}")
# Add node contents
for item in statements:
self.statements.append(SimpleStatement(item))
def add_statement(self, statement: Statement) -> None:
self.statements.append(statement)
def add_comment(self, contents: str) -> None:
self.add_statement(SimpleStatement(None, comment=contents))
def add_if_else(self, if_else: IfElseStatement) -> None:
if if_else.if_body.ends_in_jump():
# Transform `if (A) { B; return C; } else { D; }`
# into `if (A) { B; return C; } D;`,
# which reduces indentation to make the output more readable
self.statements.append(replace(if_else, else_body=None))
if if_else.else_body is not None:
self.extend(if_else.else_body)
return
self.statements.append(if_else)
def add_do_while_loop(self, do_while_loop: DoWhileLoop) -> None:
self.statements.append(do_while_loop)
def add_switch(self, switch: SwitchStatement) -> None:
self.add_statement(switch)
def is_empty(self) -> bool:
return not any(statement.should_write() for statement in self.statements)
def ends_in_jump(self) -> bool:
"""
Returns True if the body ends in an unconditional jump (`goto` or `return`),
which may allow for some syntax transformations.
For example, this is True for bodies ending in a ReturnNode, because
`return ...;` statements are marked with is_jump.
This function is conservative: it only returns True if we're
*sure* if the control flow won't continue past the Body boundary.
"""
for statement in self.statements[::-1]:
if not statement.should_write():
continue
return isinstance(statement, SimpleStatement) and statement.is_jump
return False
def get_lone_if_statement(self) -> Optional[IfElseStatement]:
"""If the body consists solely of one IfElseStatement, return it, else None."""
ret: Optional[IfElseStatement] = None
for statement in self.statements:
if statement.should_write():
if not isinstance(statement, IfElseStatement) or ret:
return None
ret = statement
return ret
def elide_empty_returns(self) -> None:
"""Remove `return;` statements from the end of the body.
If the final statement is an if-else block, recurse into it."""
for statement in self.statements[::-1]:
if (
isinstance(statement, SimpleStatement)
and statement.contents == "return;"
):
statement.clear()
if not statement.should_write():
continue
if isinstance(statement, IfElseStatement):
statement.if_body.elide_empty_returns()
if statement.else_body is not None:
statement.else_body.elide_empty_returns()
# We could also do this to SwitchStatements, but the generally
# preferred style is to keep the final return/break
break
def format(self, fmt: Formatter) -> str:
return "\n".join(
statement.format(fmt)
for statement in self.statements
if statement.should_write()
)
def label_for_node(context: Context, node: Node) -> str:
if node.loop:
return f"loop_{node.block.index}"
else:
return f"block_{node.block.index}"
def emit_node(context: Context, node: Node, body: Body) -> bool:
"""
Try to emit a node for the first time, together with a label for it.
The label is only printed if something jumps to it, e.g. a loop.
For return nodes, it's preferred to emit multiple copies, rather than
goto'ing a single return statement.
For other nodes that were already emitted, instead emit a goto.
Since nodes represent positions in assembly, and we use phi's for preserved
variable contents, this will end up semantically equivalent. This can happen
sometimes when early returns/continues/|| are not detected correctly, and
this hints at that situation better than if we just blindly duplicated the block.
"""
if node in context.emitted_nodes:
# TODO: Treating ReturnNode as a special case and emitting it repeatedly
# hides the fact that we failed to fold the control flow. Maybe remove?
if not isinstance(node, ReturnNode):
emit_goto(context, node, body)
return False
else:
body.add_comment(
f"Duplicate return node #{node.name()}. Try simplifying control flow for a better match"
)
else:
body.add_statement(LabelStatement(context, node))
context.emitted_nodes.add(node)
body.add_node(node, comment_empty=True)
if isinstance(node, ReturnNode):
emit_return(context, node, body)
return True
def emit_goto(context: Context, target: Node, body: Body) -> None:
assert not isinstance(target, TerminalNode), "cannot goto a TerminalNode"
label = label_for_node(context, target)
context.goto_nodes.add(target)
body.add_statement(SimpleStatement(f"goto {label};", is_jump=True))
def add_labels_for_switch(
context: Context, node: SwitchNode, default_node: Optional[Node]
) -> int:
assert node.cases, "jtbl list must not be empty"
switch_index = context.switch_nodes[node]
# Determine offset
offset = 0
switch_control = get_block_info(node).switch_control
if isinstance(switch_control, SwitchControl):
offset = switch_control.offset
# Force hex for case labels if the highest label is above 50, and there are no negative labels
use_hex = context.fmt.coding_style.hex_case or (
offset >= 0 and (len(node.cases) + offset) > 50
)
# Mark which labels we need to emit
if default_node is not None:
# `None` is a sentinel value to mark the `default:` block
context.case_nodes[default_node].append((switch_index, "default"))
for index, target in enumerate(node.cases):
# Do not emit extra `case N:` labels for the `default:` block
if target == default_node:
continue
# Do not emit labels that skip the switch block entirely
if target == node.immediate_postdominator:
continue
case_num = index + offset
case_label = f"case 0x{case_num:X}" if use_hex else f"case {case_num}"
context.case_nodes[target].append((switch_index, case_label))
return switch_index
def is_switch_guard(node: Node) -> bool:
"""Return True if `node` is a ConditionalNode for checking the bounds of a
SwitchNode's control expression. These can usually be combined in the output."""
if not isinstance(node, ConditionalNode):
return False
cond = get_block_info(node).branch_condition
assert cond is not None
switch_node = node.fallthrough_edge
if not isinstance(switch_node, SwitchNode):
return False
switch_block_info = get_block_info(switch_node)
assert switch_block_info.switch_control is not None
# The SwitchNode must have no statements, and the conditional
# from the ConditionalNode must properly check the jump table bounds.
return (
switch_node.parents == [node]
and not switch_block_info.statements_to_write()
and switch_block_info.switch_control.matches_guard_condition(cond)
)
def gather_any_comma_conditions(block_info: BlockInfo) -> Condition:
branch_condition = block_info.branch_condition
assert branch_condition is not None
comma_statements = block_info.statements_to_write()
if comma_statements:
assert not isinstance(branch_condition, CommaConditionExpr)
return CommaConditionExpr(comma_statements, branch_condition)
else:
return branch_condition
def try_make_if_condition(
chained_cond_nodes: List[ConditionalNode], end: Node
) -> Optional[Tuple[Condition, Node, Optional[Node]]]:
"""
Try to express the nodes in `chained_cond_nodes` as a single `Condition` `cond`
to make an if-else statement. `end` is the immediate postdominator of the first
node in `chained_cond_nodes`, and is the node following the if-else statement.
Returns a tuple of `(cond, if_node, else_node)` representing:
```
if (cond) {
goto if_node;
} else {
goto else_node;
}
```
If `else_node` is `None`, then the else block is empty and can be omitted.
This function returns `None` if the topology of `chained_cond_nodes` cannot
be represented by a single `Condition`.
It also returns `None` if `cond` has an outermost && expression with a
`CommaConditionExpr`: these are better represented as nested if statements.
"""
start_node = chained_cond_nodes[0]
if_node = chained_cond_nodes[-1].fallthrough_edge
else_node: Optional[Node] = chained_cond_nodes[-1].conditional_edge
assert else_node is not None
# Check that all edges point "forward" to other nodes in the if statement
# and translate this DAG of nodes into a dict we can easily modify
allowed_nodes = set(chained_cond_nodes) | {if_node, else_node}
node_cond_edges: Dict[ConditionalNode, Tuple[Condition, Node, Node]] = {}
for node in chained_cond_nodes:
if (
node.conditional_edge not in allowed_nodes
or node.fallthrough_edge not in allowed_nodes
):
# Not a valid set of chained_cond_nodes
return None
allowed_nodes.remove(node)
block_info = get_block_info(node)
if node is start_node:
# The first condition in an if-statement will have unrelated
# statements in its to_write list, which our caller will already
# have emitted. Avoid emitting them twice.
cond = block_info.branch_condition
assert isinstance(cond, Condition)
else:
# Otherwise, these statements will be added to the condition
cond = gather_any_comma_conditions(block_info)
node_cond_edges[node] = (cond, node.conditional_edge, node.fallthrough_edge)
# Iteratively (try to) reduce the nodes into a single condition
#
# This is done through a process similar to "Rule T2" used in interval analysis
# of control flow graphs, see ref. slides 17-21 of:
# http://misailo.web.engr.illinois.edu/courses/526-sp17/lec1.pdf
#
# We have already ensured that all edges point forward (no loops), and there
# are no incoming edges to internal nodes from outside the chain.
#
# Pick the first pair of nodes which form one of the 4 possible reducible
# subgraphs, and then "collapse" them together by combining their conditions
# and adjusting their edges. This process is repeated until no more changes
# are possible, and is a success if there is exactly 1 condition left.
while True:
# Calculate the parents for each node in our subgraph
node_parents: Dict[ConditionalNode, List[ConditionalNode]] = {
node: [] for node in node_cond_edges
}
for node in node_cond_edges:
for child in node_cond_edges[node][1:]:
if child not in (if_node, else_node):
assert isinstance(child, ConditionalNode)
node_parents[child].append(node)
# Find the first pair of nodes which form a reducible pair: one will always
# be the *only* parent of the other.
# Note: we do not include `if_node` or `else_node` in this search
for child, parents in node_parents.items():
if len(parents) != 1:
continue
parent = parents[0]
child_cond, child_if, child_else = node_cond_edges[child]
parent_cond, parent_if, parent_else = node_cond_edges[parent]
# The 4 reducible subgraphs, see ref. slides 21-22 of:
# https://www2.cs.arizona.edu/~collberg/Teaching/553/2011/Resources/ximing-slides.pdf
# In summary:
# - The child must have exactly one incoming edge, from the parent
# - The parent's other edge must be in common with one of the child's edges
# - Replace the condition with a combined condition from the two nodes
# - Replace the parent's edges with the child's edges
if parent_if is child_if and parent_else is child:
parent_else = child_else
cond = join_conditions(parent_cond, "||", child_cond)
elif parent_if is child_else and parent_else is child:
parent_else = child_if
cond = join_conditions(parent_cond, "||", child_cond.negated())
elif parent_if is child and parent_else is child_if:
parent_if = child_else
cond = join_conditions(parent_cond, "&&", child_cond.negated())
elif parent_if is child and parent_else is child_else:
parent_if = child_if
cond = join_conditions(parent_cond, "&&", child_cond)
else:
continue
# Modify the graph by replacing `parent`'s condition/edges, and deleting `child`
node_cond_edges[parent] = (cond, parent_if, parent_else)
node_cond_edges.pop(child)
break
else:
# No pair was found, we're done!
break
# Were we able to collapse all conditions from chained_cond_nodes into one?
if len(node_cond_edges) != 1 or start_node not in node_cond_edges:
return None
cond, left_node, right_node = node_cond_edges[start_node]
# Negate the condition if the if/else nodes are backwards
if (left_node, right_node) == (else_node, if_node):
cond = cond.negated()
else:
assert (left_node, right_node) == (if_node, else_node)
# Check if the if/else needs an else block
if else_node is end:
else_node = None
elif if_node is end:
# This is rare, but re-write if/else statements with an empty if body
# from `if (cond) {} else { else_node; }` into `if (!cond) { else_node; }`
cond = cond.negated()
if_node = else_node
else_node = None
# If there is no `else`, then check the conditions in the outermost `&&` expression.
# Complex `&&` conditions are better represented with nested ifs.
if else_node is None:
c: Expression = cond
while isinstance(c, BinaryOp) and c.op == "&&":
if isinstance(c.right, CommaConditionExpr):
# Fail, to try building a shorter conditional expression
return None
c = c.left
return (cond, if_node, else_node)
def build_conditional_subgraph(
context: Context, start: ConditionalNode, end: Node
) -> IfElseStatement:
"""
Output the subgraph between `start` and `end`, including the branch condition
in the ConditionalNode `start`.
This function detects "plain" if conditions, as well as conditions containing
nested && and || terms.
As generated by IDO and GCC, conditions with && and || terms are emitted in a
very particular way. There will be a "chain" of ConditionalNodes, where each
node falls through to the next node in the chain.
Each conditional edge from the nodes in this chain will go to one of:
- The head of the if block body (`if_node`)
- The head of the else block body (`else_node`)
- A *later* conditional node in the chain (no loops)
We know IDO likes to emit the assembly for basic blocks in the same order that
they appear in the C source. So, we generally call the fallthrough of the final
ConditionalNode the `if_node` (unless it is empty). By construction, it will be
an earlier node than the `else_node`.
"""
# Find the longest fallthrough chain of ConditionalNodes.
# This is the starting point for finding the complex &&/|| Condition.
# The conditional edges will be checked in a later step.
curr_node: Node = start
chained_cond_nodes: List[ConditionalNode] = []
while True:
assert isinstance(curr_node, ConditionalNode)
chained_cond_nodes.append(curr_node)
curr_node = curr_node.fallthrough_edge
if not (
# If &&/|| detection is disabled, then limit the condition to one node
context.options.andor_detection
# Only include ConditionalNodes
and isinstance(curr_node, ConditionalNode)
# Only include nodes that are postdominated by `end`
and end in curr_node.postdominators
# Exclude the `end` node
and end is not curr_node
# Exclude any loop nodes (except `start`)
and not curr_node.loop
# Exclude nodes with incoming edges that are not part of the condition
and all(p in chained_cond_nodes for p in curr_node.parents)
# Exclude guards for SwitchNodes (they may be elided)
and not is_switch_guard(curr_node)
):
break
# We want to take the largest chain of ConditionalNodes that can be converted to
# a single condition with &&'s and ||'s. We start with the largest chain computed
above, and then trim it until it meets this criterion. The resulting chain will
# always have at least one node.
while True:
assert chained_cond_nodes
cond_result = try_make_if_condition(chained_cond_nodes, end)
if cond_result:
break
# Shorten the chain by removing the last node, then try again.
chained_cond_nodes.pop()
cond, if_node, else_node = cond_result
# Mark nodes that may have comma expressions in `cond` as emitted
context.emitted_nodes.update(chained_cond_nodes[1:])
# Build the if & else bodies
else_body: Optional[Body] = None
if else_node:
else_body = build_flowgraph_between(context, else_node, end)
if_body = build_flowgraph_between(context, if_node, end)
return IfElseStatement(cond, if_body, else_body)
def join_conditions(left: Condition, op: str, right: Condition) -> Condition:
assert op in ["&&", "||"]
return BinaryOp(left, op, right, type=Type.bool())
def emit_return(context: Context, node: ReturnNode, body: Body) -> None:
ret_info = get_block_info(node)
ret = ret_info.return_value
if ret is not None:
ret_str = format_expr(ret, context.fmt)
body.add_statement(SimpleStatement(f"return {ret_str};", is_jump=True))
context.is_void = False
else:
body.add_statement(SimpleStatement("return;", is_jump=True))
def build_switch_between(
context: Context,
switch: SwitchNode,
default: Optional[Node],
end: Node,
) -> SwitchStatement:
"""
Output the subgraph between `switch` and `end`, but not including `end`.
The returned SwitchStatement starts with the jump to the switch's value.
"""
switch_cases = switch.cases[:]
if default is end:
default = None
elif default is not None:
switch_cases.append(default)
switch_index = add_labels_for_switch(context, switch, default)
jump = get_block_info(switch).switch_control
assert jump is not None
switch_body = Body(print_node_comment=context.options.debug)
# Order case blocks by their position in the asm, not by their order in the jump table
# (but use the order in the jump table to break ties)
sorted_cases = sorted(
set(switch_cases), key=lambda node: (node.block.index, switch_cases.index(node))
)
next_sorted_cases: List[Optional[Node]] = []
next_sorted_cases.extend(sorted_cases[1:])
next_sorted_cases.append(None)
for case, next_case in zip(sorted_cases, next_sorted_cases):
if case in context.emitted_nodes or case is end:
pass
elif (
next_case is not None
and next_case not in context.emitted_nodes
and next_case is not end
and next_case in case.postdominators
):
switch_body.extend(build_flowgraph_between(context, case, next_case))
if not switch_body.ends_in_jump():
switch_body.add_comment("fallthrough")
else:
switch_body.extend(build_flowgraph_between(context, case, end))
if not switch_body.ends_in_jump():
switch_body.add_statement(SimpleStatement("break;", is_jump=True))
return SwitchStatement(jump, switch_body, switch_index)
def detect_loop(context: Context, start: Node, end: Node) -> Optional[DoWhileLoop]:
assert start.loop
# Find the condition for the do-while, if it exists
condition: Optional[Condition] = None
for node in start.loop.backedges:
if (
node in start.postdominators
and isinstance(node, ConditionalNode)
and node.fallthrough_edge == end
):
block_info = get_block_info(node)
assert block_info.branch_condition is not None
condition = block_info.branch_condition
new_end = node
break
if not condition:
return None
loop_body = build_flowgraph_between(
context,
start,
new_end,
skip_loop_detection=True,
)
emit_node(context, new_end, loop_body)
return DoWhileLoop(loop_body, condition)
def build_flowgraph_between(
context: Context, start: Node, end: Node, skip_loop_detection: bool = False
) -> Body:
"""
Output a section of a flow graph that has already been translated to our
symbolic AST. All nodes between start and end, including start but NOT end,
will be printed out using if-else statements and block info.
`skip_loop_detection` is used to prevent infinite recursion, since (in the
case of loops) this function can be recursively called by itself (via
`detect_loop`) with the same `start` argument.
"""
curr_start: Node = start
body = Body(print_node_comment=context.options.debug)
# We will split this graph into subgraphs, where the entrance and exit nodes
# of that subgraph are at the same indentation level. "curr_start" will
# iterate through these nodes by taking the immediate postdominators,
# which are commonly referred to as articulation nodes.
while curr_start != end:
assert not isinstance(curr_start, TerminalNode)
if (
not skip_loop_detection
and curr_start.loop
and curr_start not in context.emitted_nodes
):
# Find the immediate postdominator to the whole loop,
# i.e. the first node outside the loop body
imm_pdom: Node = curr_start
while imm_pdom in curr_start.loop.nodes:
assert imm_pdom.immediate_postdominator is not None
imm_pdom = imm_pdom.immediate_postdominator
# Construct the do-while loop
do_while_loop = detect_loop(context, curr_start, imm_pdom)
if do_while_loop:
body.add_do_while_loop(do_while_loop)
# Move on.
curr_start = imm_pdom
continue
# Write the current node, or a goto, to the body
if not emit_node(context, curr_start, body):
# If the node was already written, emit_node will use a goto
# and return False. After the jump, control flow will
# continue from there (hopefully hitting `end`!)
break
if curr_start.emit_goto:
# If we have decided to emit a goto here, then we should just fall
# through to the next node by index, after writing a goto.
emit_goto(context, curr_start, body)
# Advance to the next node in block order. This may skip over
# unreachable blocks -- hopefully none too important.
index = context.flow_graph.nodes.index(curr_start)
fallthrough = context.flow_graph.nodes[index + 1]
if isinstance(curr_start, ConditionalNode):
assert fallthrough == curr_start.fallthrough_edge
curr_start = fallthrough
continue
# The interval to process is [curr_start, curr_start.immediate_postdominator)
curr_end = curr_start.immediate_postdominator
assert curr_end is not None
# For nodes with branches, curr_end is not a direct successor of curr_start
if is_switch_guard(curr_start):
# curr_start is a ConditionalNode that falls through to a SwitchNode,
# where the condition checks that the switch's control expression is
# within the jump table bounds.
# We can combine the if+switch into just a single switch block.
assert isinstance(curr_start, ConditionalNode), "checked by is_switch_guard"
switch_node = curr_start.fallthrough_edge
assert isinstance(switch_node, SwitchNode), "checked by is_switch_guard"
default_node = curr_start.conditional_edge
# is_switch_guard checked that switch_node has no statements to write,
# so it is OK to mark it as emitted
context.emitted_nodes.add(switch_node)
if curr_end is switch_node:
curr_end = switch_node.immediate_postdominator
assert curr_end in curr_start.postdominators
body.add_switch(
build_switch_between(context, switch_node, default_node, curr_end)
)
elif isinstance(curr_start, SwitchNode):
body.add_switch(build_switch_between(context, curr_start, None, curr_end))
elif isinstance(curr_start, ConditionalNode):
body.add_if_else(build_conditional_subgraph(context, curr_start, curr_end))
elif (
isinstance(curr_start, BasicNode) and curr_start.fake_successor == curr_end
):
curr_end = curr_start.successor
else:
# No branch, but double check that we didn't skip any nodes.
# If the check fails, then the immediate_postdominator computation was wrong
assert curr_start.children() == [curr_end], (
f"While emitting flowgraph between {start.name()}:{end.name()}, "
f"skipped nodes while stepping from {curr_start.name()} to {curr_end.name()}."
)
# Move on.
curr_start = curr_end
return body
def build_naive(context: Context, nodes: List[Node]) -> Body:
"""Naive procedure for generating output with only gotos for control flow.
Used for --no-ifs, when the regular if_statements code fails."""
body = Body(print_node_comment=context.options.debug)
def emit_goto_or_early_return(node: Node, body: Body) -> None:
if isinstance(node, ReturnNode) and not node.is_real():
emit_node(context, node, body)
else:
emit_goto(context, node, body)
def emit_successor(node: Node, cur_index: int) -> None:
if (
cur_index + 1 < len(nodes)
and nodes[cur_index + 1] == node
and not (isinstance(node, ReturnNode) and not node.is_real())
):
# Fallthrough is fine
return
emit_goto_or_early_return(node, body)
for i, node in enumerate(nodes):
if isinstance(node, ReturnNode):
# Do not emit duplicated (non-real) return nodes; they don't have
# a well-defined position, so we emit them next to where they are
# jumped to instead.
if node.is_real():
emit_node(context, node, body)
elif isinstance(node, BasicNode):
emit_node(context, node, body)
emit_successor(node.successor, i)
elif isinstance(node, SwitchNode):
index = add_labels_for_switch(context, node, None)
emit_node(context, node, body)
jump = get_block_info(node).switch_control
assert jump is not None
body.add_switch(
SwitchStatement(
jump=jump,
body=Body(print_node_comment=False),
index=index,
)
)
elif isinstance(node, ConditionalNode):
emit_node(context, node, body)
if_body = Body(print_node_comment=True)
emit_goto_or_early_return(node.conditional_edge, if_body)
block_info = get_block_info(node)
assert block_info.branch_condition is not None
body.add_if_else(
IfElseStatement(
block_info.branch_condition,
if_body=if_body,
else_body=None,
)
)
emit_successor(node.fallthrough_edge, i)
else:
assert isinstance(node, TerminalNode)
return body
def build_body(context: Context, options: Options) -> Body:
start_node: Node = context.flow_graph.entry_node()
terminal_node: Node = context.flow_graph.terminal_node()
is_reducible = context.flow_graph.is_reducible()
if options.debug:
print("Here's the whole function!\n")
# Label switch nodes
switch_nodes = [n for n in context.flow_graph.nodes if isinstance(n, SwitchNode)]
if len(switch_nodes) == 1:
# There is only one switch in this function (no need to label)
context.switch_nodes[switch_nodes[0]] = 0
else:
for i, switch_node in enumerate(switch_nodes):
context.switch_nodes[switch_node] = i + 1
body: Body
if options.ifs and is_reducible:
body = build_flowgraph_between(context, start_node, terminal_node)
body.elide_empty_returns()
else:
body = Body(print_node_comment=context.options.debug)
if options.ifs and not is_reducible:
body.add_comment(
"Flowgraph is not reducible, falling back to gotos-only mode."
)
body.extend(build_naive(context, context.flow_graph.nodes))
# Check no nodes were skipped: build_flowgraph_between should hit every node in
# well-formed (reducible) graphs; and build_naive explicitly emits every node
unemitted_nodes = (
set(context.flow_graph.nodes)
- context.emitted_nodes
- {context.flow_graph.terminal_node()}
)
for node in unemitted_nodes:
if isinstance(node, ReturnNode) and not node.is_real():
continue
body.add_comment(
f"bug: did not emit code for node #{node.name()}; contents below:"
)
emit_node(context, node, body)
return body
def get_function_text(function_info: FunctionInfo, options: Options) -> str:
fmt = options.formatter()
context = Context(flow_graph=function_info.flow_graph, options=options, fmt=fmt)
body: Body = build_body(context, options)
function_lines: List[str] = []
fn_name = function_info.stack_info.function.name
arg_strs = []
for i, arg in enumerate(function_info.stack_info.arguments):
if i == 0 and function_info.stack_info.replace_first_arg is not None:
original_name, original_type = function_info.stack_info.replace_first_arg
arg_strs.append(original_type.to_decl(original_name, fmt))
else:
arg_strs.append(arg.type.to_decl(arg.format(fmt), fmt))
if function_info.stack_info.is_variadic:
arg_strs.append("...")
arg_str = ", ".join(arg_strs) or "void"
fn_header = f"{fn_name}({arg_str})"
if context.is_void:
fn_header = f"void {fn_header}"
else:
fn_header = function_info.return_type.to_decl(fn_header, fmt)
whitespace = "\n" if fmt.coding_style.newline_after_function else " "
function_lines.append(f"{fn_header}{whitespace}{{")
any_decl = False
with fmt.indented():
local_vars = function_info.stack_info.local_vars
# GCC's stack is ordered low-to-high (e.g. `int sp10; int sp14;`)
# IDO's stack is ordered high-to-low (e.g. `int sp14; int sp10;`)
if options.compiler == Options.CompilerEnum.IDO:
local_vars = local_vars[::-1]
for local_var in local_vars:
type_decl = local_var.toplevel_decl(fmt)
if type_decl is not None:
function_lines.append(SimpleStatement(f"{type_decl};").format(fmt))
any_decl = True
# With reused temps (no longer used), we can get duplicate declarations,
# hence the use of a set here.
temp_decls = set()
for temp_var in function_info.stack_info.temp_vars:
if temp_var.need_decl():
expr = temp_var.expr
type_decl = expr.type.to_decl(expr.var.format(fmt), fmt)
temp_decls.add(f"{type_decl};")
any_decl = True
for decl in sorted(temp_decls):
function_lines.append(SimpleStatement(decl).format(fmt))
for phi_var in function_info.stack_info.phi_vars:
type_decl = phi_var.type.to_decl(phi_var.get_var_name(), fmt)
function_lines.append(SimpleStatement(f"{type_decl};").format(fmt))
any_decl = True
for reg_var in function_info.stack_info.reg_vars.values():
if reg_var.reg not in function_info.stack_info.used_reg_vars:
continue
type_decl = reg_var.type.to_decl(reg_var.format(fmt), fmt)
function_lines.append(SimpleStatement(f"{type_decl};").format(fmt))
any_decl = True
# Create a variable to cast the original first argument to the assumed type
if function_info.stack_info.replace_first_arg is not None:
assert len(function_info.stack_info.arguments) >= 1
replaced_arg = function_info.stack_info.arguments[0]
original_name, original_type = function_info.stack_info.replace_first_arg
lhs = replaced_arg.type.to_decl(replaced_arg.format(fmt), fmt)
rhs = f"({replaced_arg.type.format(fmt)}) {original_name}"
function_lines.append(SimpleStatement(f"{lhs} = {rhs};").format(fmt))
if any_decl:
function_lines.append("")
function_lines.append(body.format(fmt))
function_lines.append("}")
full_function_text: str = "\n".join(function_lines)
return full_function_text
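The pairwise "Rule T2" reduction performed by `try_make_if_condition` can be sketched on a toy graph. This is an illustrative standalone model, not the module's real `ConditionalNode`/`Condition` classes: nodes are plain `(cond, if_target, else_target)` tuples keyed by name, conditions are strings, and `"IF"`/`"ELSE"` stand in for the `if_node`/`else_node` sinks.

```python
def collapse(nodes, start):
    """Collapse a forward chain of (cond, if_target, else_target) nodes.

    Returns the single remaining tuple, or None if the graph cannot be
    reduced to one condition.
    """
    while True:
        # Compute the parents of every node still in the chain
        parents = {name: [] for name in nodes}
        for name, (_, t, f) in nodes.items():
            for child in (t, f):
                if child in parents:
                    parents[child].append(name)
        for child, ps in parents.items():
            if len(ps) != 1:
                continue  # the child must have exactly one incoming edge
            parent = ps[0]
            c_cond, c_if, c_else = nodes[child]
            p_cond, p_if, p_else = nodes[parent]
            # The four reducible shapes, mirroring the real implementation
            if p_if == c_if and p_else == child:
                nodes[parent] = (f"({p_cond} || {c_cond})", c_if, c_else)
            elif p_if == c_else and p_else == child:
                nodes[parent] = (f"({p_cond} || !{c_cond})", c_else, c_if)
            elif p_if == child and p_else == c_if:
                nodes[parent] = (f"({p_cond} && !{c_cond})", c_else, c_if)
            elif p_if == child and p_else == c_else:
                nodes[parent] = (f"({p_cond} && {c_cond})", c_if, c_else)
            else:
                continue
            del nodes[child]
            break
        else:
            break  # no reducible pair left
    return nodes[start] if len(nodes) == 1 and start in nodes else None
```

For example, `A: if a goto B else goto ELSE; B: if b goto IF else goto ELSE` collapses to `(a && b)`, while `A: if a goto IF else goto B` with the same `B` collapses to `(a || b)`.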
|
<?php
namespace Admin\Service;
use Common\Service\BaseService;
class RuleService extends BaseService{
public function ruleindex( $param=[] ){
$ruleObj = $this->getModel('rule');
$rule_init = $ruleObj->getrule();
$rule = $ruleObj->getTreeData( $rule_init , 'level','id','title');
$rule = $this->removekey($rule);
unset($ruleObj);
return tp_return( 0 , 'ok' , $rule );
}
public function removekey( $arr ){
$newarr = [];
foreach ($arr as $key => $value) {
$current = $value;
$current['tags'][] = $value['name'];
if(!empty($arr[$key]['nodes'])){
$current['nodes'] = $this->removekey($arr[$key]['nodes']);
}else{
unset($current['nodes']);
}
$newarr[] = $current;
}
return $newarr;
}
public function editrule( $param=[] ){
$data = [
'title' => $param['rule_name'],
'name' => $param['rule']
];
$ruleObj = $this->getModel('rule');
if( $param['id']==="" ){
$data['pid'] = $param['pid'];
$bool = $ruleObj->addrule( $data );
}else{
$map['id'] = $param['id'];
$bool = $ruleObj->editrule( $map , $data );
}
unset($ruleObj);
return tp_return( 0 , '操作成功' , $bool );
}
public function delrule( $param=[] ){
$map = [
'id' => $param['id'],
];
$ruleObj = $this->getModel('rule');
$child = $ruleObj->childrule( $map );
if( $child!=0 ){
unset($ruleObj);
return tp_return( -2 , '请先删除子权限' , $child );
}
$bool = $ruleObj->deleterule( $map );
unset($ruleObj);
return tp_return( 0 , '操作成功' , $bool );
}
}
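The recursive normalization done by `RuleService::removekey` (append each node's `name` to its `tags`, recurse into non-empty `nodes`, drop empty `nodes` keys, and re-index) can be sketched in Python. This is a hedged sketch assuming each node is a dict with `name`, an optional `tags` list, and an optional `nodes` child list, mirroring the PHP array shape:

```python
def remove_key(items):
    """Normalize a rule tree the way removekey() does."""
    out = []
    for node in items:
        current = dict(node)  # shallow copy, as PHP's value assignment does
        # append the node's name to its tags (creating the list if absent)
        current["tags"] = list(node.get("tags", [])) + [node["name"]]
        children = node.get("nodes")
        if children:
            current["nodes"] = remove_key(children)
        else:
            # drop empty child lists entirely, matching unset($current['nodes'])
            current.pop("nodes", None)
        out.append(current)
    return out
```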
|
import { Component, OnInit, Output, Input, EventEmitter } from '@angular/core';
import { NzMessageService } from 'ng-zorro-antd';
import * as xolor from 'xolor';
import ColorInterface from '../store/color.interface';
import { ClipboardService } from '../clipboard.service';
import { Store, select } from '@ngrx/store';
import { AddColor } from '../store/color.actions';
import { Observable } from 'rxjs';
import { map, first } from 'rxjs/operators';
const formatColor = (num) => `00${num.toString(16)}`.slice(-2);
@Component({
selector: 'playground-color-square',
templateUrl: './color-square.component.html',
styleUrls: ['./color-square.component.scss']
})
export class ColorSquareComponent implements OnInit {
copies = 0;
data$: Observable<ColorInterface[]>;
colorExists$: Observable<boolean>;
@Input() r = 0;
@Input() g = 0;
@Input() b = 0;
constructor(
private nzMessage: NzMessageService,
private clipboard: ClipboardService,
private store: Store<{ colors: Array<ColorInterface> }>) {
this.data$ = this.store
.pipe(select('colors'));
// Note: despite its name, this stream emits `true` when the color is NOT
// yet in the store (every stored hex differs), i.e. when it is safe to add.
this.colorExists$ = this.data$
.pipe(map((value: ColorInterface[]) => value.every(x => x.hex !== this.hex)))
.pipe(first());
this.data$
.subscribe({
next: (data) => this.copies = data.length,
})
}
copyColor($event: Event) {
const hex = this.rgb;
const { r, g, b } = this;
this.colorExists$
.subscribe({
next: (exists) => {
if(exists) {
this.clipboard.copy(hex);
this.store.dispatch(new AddColor({ r, g, b, hex }));
} else {
this.nzMessage.error('Color exists');
}
}
})
}
get hex() { return this.rgb; }
get rgb() {
return `#${formatColor(this.r)}${formatColor(this.g)}${formatColor(this.b)}`;
}
get myStyle() {
const xolorObject = xolor(this.rgb)
const comp = xolorObject.comp().hex;
const inver = xolorObject.inverse().hex;
return {
background: this.rgb,
border: `1px solid ${comp}`,
color: inver,
'text-shadow': `${comp} 1px 1px 10px`,
}
}
ngOnInit() {
}
}
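The component's `formatColor` helper zero-pads each channel to two hex digits so `#rrggbb` strings stay well-formed. The equivalent logic, sketched in Python (this sketch assumes channel values are in the 0-255 range; the TypeScript version additionally truncates larger values via `slice(-2)`):

```python
def format_color(num):
    """Zero-pad a channel value (assumed 0-255) to two lowercase hex digits."""
    return format(num, "02x")

def rgb_hex(r, g, b):
    """Build the #rrggbb string that the component's `rgb` getter produces."""
    return f"#{format_color(r)}{format_color(g)}{format_color(b)}"
```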
|
CREATE TABLE B_SALE_HDALE
(
CODE VARCHAR(100) NOT NULL,
ID INT NOT NULL,
NAME VARCHAR(100) NOT NULL,
PCITY VARCHAR(100) NULL,
PSUBREGION VARCHAR(100) NULL,
PREGION VARCHAR(100) NULL,
PCOUNTRY VARCHAR(100) NULL,
LOCATION_ID INT NULL,
LOCATION_EXT_ID INT NULL
)
GO
ALTER TABLE B_SALE_HDALE ADD CONSTRAINT PK_B_SALE_HDALE PRIMARY KEY (CODE)
GO
CREATE INDEX IX_BSHDALE_LOCATION_ID ON B_SALE_HDALE (LOCATION_ID)
GO
|
use crate::query_builder::Only;
use crate::Table;
/// The `only` method
///
/// This is only implemented for the Postgres backend.
/// The `ONLY` clause is used to select only from one table and not any inherited ones.
///
/// Calling this function on a table (`mytable.only()`) will result in the SQL `ONLY mytable`.
/// `mytable.only()` can be used just like any table in diesel since it implements
/// [Table](crate::Table).
///
/// Example:
///
/// ```rust
/// # include!("../../../doctest_setup.rs");
/// # use schema::{posts, users};
/// # use diesel::dsl::*;
/// # fn main() {
/// # let connection = &mut establish_connection();
/// let n_users_in_main_table = users::table
/// .only()
/// .select(count(users::id))
/// .first::<i64>(connection);
/// # }
/// ```
/// Selects the number of entries in the `users` table excluding any rows found in inherited
/// tables.
///
/// It can also be used in inner joins:
///
/// ```rust
/// # include!("../../../doctest_setup.rs");
/// # use schema::{posts, users};
/// # use diesel::dsl::*;
/// # fn main() {
/// # let connection = &mut establish_connection();
/// # let _ =
/// users::table
/// .inner_join(posts::table.only())
/// .select((users::name, posts::title))
/// .load::<(String, String)>(connection);
/// # }
/// ```
/// That query excludes any posts that reside in any inherited table.
///
pub trait OnlyDsl: Table {
/// See the trait-level docs.
fn only(self) -> Only<Self> {
Only { source: self }
}
}
impl<T: Table> OnlyDsl for T {}
|
package com_app_wx.app.contact;
import com_app_wx.app.AppMainPage;
import org.junit.jupiter.api.*;
import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.Arguments;
import org.junit.jupiter.params.provider.MethodSource;
import java.net.MalformedURLException;
import java.util.stream.Stream;
import static org.junit.jupiter.api.Assertions.assertEquals;
/**
 * @Author: zhangcheng
 * @Date: 2020/11/27 16:44
 * @Description: WeChat Work (企业微信) tests
 */
@TestMethodOrder(MethodOrderer.OrderAnnotation.class) // run the test methods in the order given by @Order
public class AppContactPageTest {
private static AppMainPage mainPage;
public AppContactPageTest() {
}
@BeforeAll
static void beforeAll() throws MalformedURLException {
// data cleanup via the API
mainPage = new AppMainPage();
}
@BeforeEach
void beforeEach(){ // navigate to the entry page
}
@AfterEach
void afterEach(){ // return to the entry page
}
/**
 * @MethodSource lets you reference one or more factory methods on the test
 * class or an external class. Such factory methods must return a Stream,
 * Iterable, Iterator, or array of Arguments, and must not accept any parameters.
 * Factory methods in the test class must be static unless the class is
 * annotated with @TestInstance(Lifecycle.PER_CLASS); factory methods in
 * external classes must always be static.
 *
 * When only one parameter is needed, a Stream of instances of the parameter
 * type can be returned, as in the example below.
 * @return
 */
public static Stream<Arguments> data(){
return Stream.of(
Arguments.arguments("zhangssi","13262553523"),
Arguments.arguments("zhangwud","13262553567"));
}
@MethodSource("data")
@ParameterizedTest
@Order(1)
public void testAddMember(String userName, String userPhone) throws InterruptedException {
AppContactPage appContactPage = mainPage.contact().addMember(userName, userPhone);
assertEquals("添加成功", appContactPage.getToast());
}
}
|
<?php
/************************************************************************
* This script combines functions from the Inbound Shipment API
* with Klasrun-specific functions to create a shipment to be sent to Amazon.
************************************************************************/
/*************************************************************
* Call PutTransportContent to add dimensions to all items in
* each shipment.
*************************************************************/
function spPutTransport($member, $key, $service, $shipmentId) {
// Create array of dimensions in shipment
$shipmentDimensions = array();
foreach ($member as $value) {
$shipmentDimensions[] = $value;
}
// Enter parameters to be passed into PutTransportContent
$parameters = array (
'SellerId' => MERCHANT_ID,
'ShipmentId' => $shipmentId,
'IsPartnered' => 'true',
'ShipmentType' => 'SP',
'TransportDetails' => array(
'PartneredSmallParcelData' => array(
'CarrierName' => 'UNITED_PARCEL_SERVICE_INC',
'PackageList' => array( 'member' => $shipmentDimensions)
)
)
);
// Send dimensions to Amazon
$requestPut = new FBAInboundServiceMWS_Model_PutTransportContentRequest($parameters);
unset($parameters);
$xmlPut = invokePutTransportContent($service, $requestPut);
}
|
module UniversumBoolType where
import Prelude (Bool)
import Universum hiding (Bool, show)
foo :: Bool
foo = undefined
|
<?php
// Muaz Khan - www.MuazKhan.com
// MIT License - https://www.webrtc-experiment.com/licence/
// Documentation - https://github.com/muaz-khan/WebRTC-Experiment/tree/master/RecordRTC
foreach(array('video/webm', 'video', 'audio') as $type) {
// debug: record how many files arrived with this request
$file = fopen("test.txt","w");
fwrite($file, sizeof($_FILES));
fclose($file);
if (isset($_FILES["${type}-blob"])) {
echo 'uploads/';
$fileName = $_POST["${type}-filename"];
$nm=time().".webm";
$uploadDirectory = '/var/www/owncloud/data/operator/files/Camera/'.$fileName;
if (!move_uploaded_file($_FILES["${type}-blob"]["tmp_name"], $uploadDirectory)) {
echo(" problem moving uploaded file");
}
echo($fileName);
}
}
?>
|
-- db_patches
INSERT INTO `db_patches` (issue, created) VALUES ('POCOR-3058', NOW());
-- code here
UPDATE `area_administratives` SET `parent_id` = NULL WHERE `parent_id` = -1;
UPDATE `areas` SET `parent_id` = NULL WHERE `parent_id` = -1;
|
<nav>
<div class="nav-wrapper black darken-4">
<a href="#" data-activates="mobile-demo" class="button-collapse">
<i class="material-icons">menu</i>
</a>
<!-- Mobile View -->
<ul class="side-nav" id="mobile-demo">
@if(Auth::guest())
<li><a href="{{ route('login') }}">Login</a></li>
<li><a href="{{ route('register') }}">Register</a></li>
@else
<li><a href='pages/shared'>Shared</a></li>
<li><a href='documents'>Documents</a></li>
<li><a href="mydocuments">My Documents</a></li>
<li><a href='categories'>Categories</a></li>
@hasanyrole('Root|Admin')
<li><a href='users'>Users</a></li>
<li><a href='departments'>Departments</a></li>
<li><a href='logs'>Logs</a></li>
@hasrole('Root')
<li><a href='pages/backup'>Backup</a></li>
@endhasrole
@endhasanyrole
<li class="divider"></li>
<li><a href='profile'>My Account</a></li>
<li>
<a href="{{ route('logout') }}"
onclick="event.preventDefault();
document.getElementById('logout-form').submit();">
Logout
</a>
<form id="logout-form" action="{{ route('logout') }}" method="POST" style="display: none;">
{{ csrf_field() }}
</form>
</li>
@endif
</ul>
<!-- Desktop View -->
<ul class="right hide-on-med-and-down">
<!-- Authentication Links -->
@if (Auth::guest())
<li><a href="{{ route('login') }}">Login</a></li>
<li><a href="{{ route('register') }}">Register</a></li>
@else
<!-- Dropdown Trigger -->
<li>
<a href="" class="datepicker"><i class="material-icons">date_range</i></a>
</li>
<li>
@if($trashfull > 0)
<a href='/trash'><i class="material-icons red-text">delete</i></a>
@else
<a href='/trash'><i class="material-icons">delete</i></a>
@endif
</li>
@hasanyrole('Root|Admin')
<li>
<a href='requests'>Requests<span class="new badge white-text">{{ $requests }}</span></a>
</li>
@endhasanyrole
<li>
<a class="dropdown-button" href="#!" data-activates="dropdown1">{{ Auth::user()->name }}
<i class="material-icons right">arrow_drop_down</i>
</a>
</li>
@endif
</ul>
</div>
</nav>
<!-- Dropdown Structure -->
<ul id="dropdown1" class="dropdown-content">
<li><a href='profile'>My Account</a></li>
<li><a href='mydocuments'>My Documents</a></li>
<li>
<a href="{{ route('logout') }}"
onclick="event.preventDefault();
document.getElementById('logout-form').submit();">
Logout
</a>
<form id="logout-form" action="{{ route('logout') }}" method="POST" style="display: none;">
{{ csrf_field() }}
</form>
</li>
</ul>
|
require "base64"
require "stringio"
require "socket"
module Triton
module Http
module Signing
# This class implements a minimum amount of ssh-agent protocol required
# to have the agent sign payloads on our behalf.
# At present the class only supports signing using rsa keys, dsa and ecdsa have not
# been implemented. However adding this support should not be too difficult
class Agent
# Messages identifiers
SSH2_AGENTC_SIGN_REQUEST = 13
SSH2_AGENT_SIGN_RESPONSE = 14
SSH_AGENT_FAILURE = 5
# This method signs any data passed in using the ssh-agent and returns
# the binary signature. For rsa keys the signature will be rsa encrypted sha1
#
# [data] data to sign
# [pub_key_path] Path to an ssh public key, the agent will use the corresponding private key
#
# Returns an Array containing the signature blob and the ssh signature type
#
def sign(data, pub_key_path)
# Packet format for ssh agent signing request
# byte SSH2_AGENTC_SIGN_REQUEST
# ssh_string key_blob
# ssh_string data
# uint32 flags
ssh_agent do |c|
packet = [SSH2_AGENTC_SIGN_REQUEST].pack("C") +
ssh_string(load_pub_key_data(pub_key_path)) +
ssh_string(data) +
[0].pack("N")
packet = [packet.bytesize].pack("N") + packet
c.write(packet) # write, not puts: puts would append a newline to the binary packet
# Process the response and extract the signature
# from the ssh formatted blob. We're being naive here and just
# supporting ssh-rsa for now.
response = recv_ssh2_agentc_sign_resp(c)
if response =~ /ssh-rsa/
return response.split("ssh-rsa")[1].byteslice(4..-1), "ssh-rsa"
else
raise AgentException, "Unsupported signature type returned by ssh-agent, only ssh-rsa supported"
end
end
end
# Opens the ssh-agent socket and yields it.
# We memoize the open socket to avoid opening it for every
# request.
def ssh_agent
if ENV['SSH_AUTH_SOCK'].nil?
raise AgentException, "Can't find ssh-agent socket: SSH_AUTH_SOCK env variable not set"
end
@ssh_agent_sock ||= UNIXSocket.open(ENV['SSH_AUTH_SOCK'])
yield @ssh_agent_sock
end
# Returns an ssh string as defined in RFC 4251
def ssh_string(data)
return [data.bytesize].pack("N") + data
end
# Decode a string from an ssh-agent packet defined in RFC 4251
def decode_ssh_string(data)
str_len = data.read(4).unpack("N").first
if str_len < 1
raise AgentException, "Invalid short string in packet"
end
data.read(str_len)
end
# Loads data for an ssh public key, returns it
# base64 encoded minus the type header
def load_pub_key_data(path)
File.open(path) do |f|
return Base64.decode64(f.read.split(' ')[1])
end
end
# Process a ssh2 signing request response
def recv_ssh2_agentc_sign_resp(socket)
msg = StringIO.new(recv_packet(socket))
msg_type = msg.read(1).unpack("C").first
if msg_type == SSH2_AGENT_SIGN_RESPONSE
signature_blob = decode_ssh_string(msg)
elsif msg_type == SSH_AGENT_FAILURE
raise AgentException, "Error response from ssh agent received"
else
raise AgentException, "Unexpected response from ssh agent received"
end
return signature_blob
end
# Read a packet off the wire
def recv_packet(socket)
# read the size and then the rest of the packet
msg_size = socket.read(4).unpack("N").first
if msg_size >= 1 # a 1-byte packet (e.g. SSH_AGENT_FAILURE) is the minimum valid size
packet = socket.read(msg_size)
else
raise AgentException, "Invalid short packet received from ssh-agent"
end
return packet
end
private :recv_ssh2_agentc_sign_resp, :recv_packet
end
end
end
end
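The RFC 4251 string framing the Agent class relies on can be sketched on its own. This is a minimal, standalone sketch (StringIO stands in for the agent socket; the method names mirror the ones above but nothing here talks to a real agent):

```ruby
require "stringio"

# Encode a byte string with the 4-byte big-endian length prefix
# defined for the "string" type in RFC 4251.
def ssh_string(data)
  [data.bytesize].pack("N") + data
end

# Decode one length-prefixed string from an IO-like object.
def decode_ssh_string(io)
  len = io.read(4).unpack("N").first
  io.read(len)
end

payload = "hello agent"
framed = ssh_string(payload)
decoded = decode_ssh_string(StringIO.new(framed))
```

Every field in an agent packet (key blob, data to sign, signature blob) is framed this way, which is why `sign` above can build its request by simple concatenation.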
|
#!/usr/bin/env ruby -w
class RandomBot
def reset
@cards = (1..13).to_a.shuffle
end
alias_method :initialize, :reset
def play_card
@cards.shift
end
def play
until @cards.empty?
$stdin.gets # competition card--ignored
$stdout.puts play_card
$stdout.flush
$stdin.gets # opponent's bid--ignored
end
end
end
if __FILE__ == $PROGRAM_NAME
RandomBot.new.play
end
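The stdin/stdout play loop above can be exercised without a real competition driver. The sketch below uses a hypothetical `LocalRandomBot` that mirrors only the hand-dealing and card-playing logic, so the bidding behavior can be checked in isolation:

```ruby
# Hypothetical local stand-in for RandomBot: deal a shuffled 13-card
# hand and emit bids one at a time, exactly as play_card does above.
class LocalRandomBot
  def initialize
    @cards = (1..13).to_a.shuffle
  end

  def play_card
    @cards.shift
  end

  def empty?
    @cards.empty?
  end
end

bot = LocalRandomBot.new
bids = []
bids << bot.play_card until bot.empty?
```

A full game would interleave these bids with the competition card and the opponent's bid read from stdin, which RandomBot simply ignores.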
|
package br.com.zup.pix.consulta
import br.com.zup.ConsultaChavePixResponse
import br.com.zup.pix.ChavePix
import br.com.zup.pix.ContaAssociada
import br.com.zup.pix.TipoChave
import br.com.zup.pix.TipoConta
import com.google.protobuf.Timestamp
import java.time.LocalDateTime
import java.time.ZoneId
import java.util.*
data class ChavePixInfo(
val pixId: UUID? = null,
val clienteId: UUID? = null,
val tipoChave: TipoChave,
val chave: String,
val tipoConta: TipoConta,
val conta: ContaChavePixInfo,
val registradaEm: LocalDateTime = LocalDateTime.now()
) {
companion object {
fun of(chavePix: ChavePix): ChavePixInfo {
return ChavePixInfo(
pixId = chavePix.id,
clienteId = chavePix.clienteId,
tipoChave = chavePix.tipoChave,
chave = chavePix.chave,
tipoConta = chavePix.tipoConta,
conta = ContaChavePixInfo.of(chavePix.conta),
registradaEm = chavePix.criadaEm
)
}
}
fun toConsultaChavePixResponse(): ConsultaChavePixResponse {
return ConsultaChavePixResponse.newBuilder()
.setClienteId(this.clienteId?.toString() ?: "")
.setPixId(this.pixId?.toString() ?: "")
.setChave(
ConsultaChavePixResponse.ChavePix.newBuilder()
.setTipo(br.com.zup.TipoChave.valueOf(this.tipoChave.name))
.setChave(this.chave)
.setConta(
ConsultaChavePixResponse.ChavePix.ContaInfo.newBuilder()
.setTipo(br.com.zup.TipoConta.valueOf(this.conta.tipo))
.setInstituicao(this.conta.nomeInstituicao)
.setNomeDoTitular(this.conta.nomeTitular)
.setCpfDoTitular(this.conta.cpfTitular)
.setAgencia(this.conta.agencia)
.setNumeroDaConta(this.conta.numero)
.build()
)
.setCriadaEm(
this.registradaEm.let {
val createdAt = it.atZone(ZoneId.of("UTC")).toInstant()
Timestamp.newBuilder()
.setSeconds(createdAt.epochSecond)
.setNanos(createdAt.nano)
.build()
}
)
.build()
)
.build()
}
data class ContaChavePixInfo(
val tipo: String,
val nomeInstituicao: String,
val nomeTitular: String,
val cpfTitular: String,
val agencia: String,
val numero: String,
) {
companion object {
fun of(contaAssociada: ContaAssociada): ContaChavePixInfo {
return ContaChavePixInfo(
tipo = contaAssociada.tipo,
nomeInstituicao = contaAssociada.instituicao.nomeInstituicao,
nomeTitular = contaAssociada.titular.nomeTitular,
cpfTitular = contaAssociada.titular.cpf,
agencia = contaAssociada.agencia,
numero = contaAssociada.numero
)
}
}
}
}
|
using GalaSoft.MvvmLight.Messaging;
using gestadh45.model;
namespace gestadh45.business.PersonalizedMsg
{
/// <summary>
/// Notification message requesting display of information about the current season.
/// </summary>
public class NMShowInfosSaisonCourante : NotificationMessage<Saison>
{
/// <summary>
/// Initializes a new instance of NMShowInfosSaisonCourante.
/// </summary>
/// <param name="saison">Information to display.</param>
public NMShowInfosSaisonCourante(Saison saison) : base(saison, NMType.NMShowInfosSaisonCourante) {
}
}
}
|
package com.thilawfabrice.compass.ui.displaytips
import android.annotation.SuppressLint
import android.os.Bundle
import android.os.Handler
import android.os.Looper
import android.view.View
import android.widget.FrameLayout
import android.widget.ImageView
import android.widget.LinearLayout
import android.widget.TextView
import androidx.appcompat.app.AppCompatActivity
import com.google.android.material.button.MaterialButton
import com.google.gson.Gson
import com.thilawfabrice.compass.R
import com.thilawfabrice.compass.domain.entities.TipForRemoteWork
import kotlinx.android.synthetic.main.activity_tip_details.*
/**
* An example full-screen activity that shows and hides the system UI (i.e.
* status bar and navigation/system bar) with user interaction.
*/
class TipDetailsActivity : AppCompatActivity(), View.OnClickListener {
private var tip: TipForRemoteWork? = null
private lateinit var fullscreenContentControls: LinearLayout
private val hideHandler = Handler(Looper.getMainLooper())
@SuppressLint("InlinedApi")
private val hidePart2Runnable = Runnable {
// Delayed removal of status and navigation bar
// Note that some of these constants are new as of API 16 (Jelly Bean)
// and API 19 (KitKat). It is safe to use them, as they are inlined
// at compile-time and do nothing on earlier devices.
mainContent.systemUiVisibility =
View.SYSTEM_UI_FLAG_LOW_PROFILE or
View.SYSTEM_UI_FLAG_FULLSCREEN or
View.SYSTEM_UI_FLAG_LAYOUT_STABLE or
View.SYSTEM_UI_FLAG_IMMERSIVE_STICKY or
View.SYSTEM_UI_FLAG_LAYOUT_HIDE_NAVIGATION or
View.SYSTEM_UI_FLAG_HIDE_NAVIGATION
}
private val hideRunnable = Runnable { hide() }
@SuppressLint("ClickableViewAccessibility")
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_tip_details)
supportActionBar?.setDisplayHomeAsUpEnabled(true)
val view = findViewById<FrameLayout>(R.id.root)
val contentTV: TextView = view.findViewById(R.id.content)
val authorNameTV: TextView = view.findViewById(R.id.tipAuthorName)
val authorRoleTV: TextView = view.findViewById(R.id.tipAuthorRole)
val authorPictureImg: ImageView = view.findViewById(R.id.tipAuthorPicture)
val shareOnTwitterBtn: ImageView = view.findViewById(R.id.shareOnTwitter)
val shareOnFacebookBtn: ImageView = view.findViewById(R.id.shareOnFacebook)
val shareOnLinkedInBtn: ImageView = view.findViewById(R.id.shareOnLinkedIn)
val shareCopyBtn: ImageView = view.findViewById(R.id.shareCopy)
val visitAuthorWebPage: MaterialButton = view.findViewById(R.id.goToWebsite)
shareOnTwitterBtn.setOnClickListener(this)
shareOnLinkedInBtn.setOnClickListener(this)
shareOnFacebookBtn.setOnClickListener(this)
shareCopyBtn.setOnClickListener(this)
visitAuthorWebPage.setOnClickListener(this)
val tipIntentExtra: String? = intent.getStringExtra(TIP_DETAILS_INTENT_EXTRA)
tip = if (tipIntentExtra != null) {
Gson().fromJson(tipIntentExtra, TipForRemoteWork::class.java)
} else null
tip?.let {
with(it) {
contentTV.text = content
authorNameTV.text = author?.name
val company =
author?.companyName?.takeIf { it.isNotBlank() }?.let { "at $it" } ?: ""
val role = "${author?.role} $company"
authorRoleTV.text = role
com.bumptech.glide.Glide.with(view.context)
.load(author?.picture)
.circleCrop()
.into(authorPictureImg)
}
}
}
private fun hide() {
// Hide UI first
supportActionBar?.hide()
hideHandler.postDelayed(hidePart2Runnable, UI_ANIMATION_DELAY.toLong())
}
/**
* Schedules a call to hide() in [delayMillis], canceling any
* previously scheduled calls.
*/
private fun delayedHide(delayMillis: Int) {
hideHandler.removeCallbacks(hideRunnable)
hideHandler.postDelayed(hideRunnable, delayMillis.toLong())
}
companion object {
/**
* Some older devices needs a small delay between UI widget updates
* and a change of the status and navigation bar.
*/
private const val UI_ANIMATION_DELAY = 300
const val TIP_DETAILS_INTENT_EXTRA = "tip"
}
override fun onClick(v: View) {
when (v.id) {
R.id.shareOnTwitter -> {
// listenToBox.onShare(tip, SocialTarget.TWITTER)
}
R.id.shareOnLinkedIn -> {
// listenToBox.onShare(tip, SocialTarget.LINKEDIN)
}
R.id.shareOnFacebook -> {
// listenToBox.onShare(tip, SocialTarget.FACEBOOK)
}
R.id.shareCopy -> {
}
R.id.goToWebsite -> {
}
}
}
}
|
require 'faraday'
require 'fileutils'
require 'support/integration_spec_support'
describe "The pact-mock-service command line interface", mri_only: true do
include Pact::IntegrationTestSupport
before :all do
FileUtils.rm_rf 'tmp'
@pid = nil
@pid = fork do
exec "bundle exec bin/pact-mock-service --port 1234 --log tmp/integration.log --pact-dir tmp/pacts"
end
wait_until_server_started 1234
end
it "starts up and responds with mocked responses" do
response = setup_interaction 1234
expect(response.status).to eq 200
response = invoke_expected_request 1234
expect(response.status).to eq 200
expect(response.body).to eq 'Hello world'
response = write_pact 1234
expect(response.status).to eq 200
end
it "writes logs to the specified log file" do
expect(File.exist?('tmp/integration.log')).to be true
end
it "writes the pact to the specified directory" do
expect(File.exist?('tmp/pacts/consumer-provider.json')).to be true
end
after :all do
if @pid
Process.kill "INT", @pid
end
end
end
|
module modhorfft
implicit none
contains
subroutine test_fft(u,avgerror)
! test accuracy of fft for a given function
use flags, only: debug
use dim, only: nx, nxpp, nxpl, ny, nypl, nzpl
use mpicom, only: myid, numprocs, comm, ierr
! i/o
real, dimension(nxpp,nypl,nzpl), intent(in) :: u
real, intent(out) :: avgerror
! vars
integer :: i,j,k,ii
real, allocatable, dimension(:,:,:) :: qf, q
allocate(qf(nxpl,ny,nzpl),q(nxpp,nypl,nzpl))
q = u
call horfft(q,qf,-1)
q = 0.0
call horfft(q,qf,1)
avgerror = 0.0
do k=1,nzpl
do j=1,nypl
do i=1,nx
avgerror = avgerror + abs( u(i,j,k)-q(i,j,k) )/(nx*nypl*nzpl)
end do
end do
end do
if (debug>=1 .and. myid==0) then
write(*,*) 'avg error introduced by FFT =', avgerror
end if
if (debug>=3) then
if (myid==0) then
write(*,*) '----------------------------------------------------'
write(*,*) 'u(:,:,1) in test_fft:'
end if
do ii=1,numprocs
if (myid+1 == ii) then
write(*,'(256(G12.4,'',''))') ( (u(i,j,k),i=1,nx),j=1,nypl )
else
call sleep(1)
end if
end do
call MPI_BARRIER(comm,ierr)
endif
deallocate(qf,q)
end subroutine test_fft
subroutine horfft(ain,ainf,is)
! SUBROUTINE THAT PERFORMS 2-D FFT OVER ALL nzpl XY PLANES ON A
! PROCESSOR.
!
! is = -1: PHYSICAL TO SPECTRAL SPACE
! is = 1: SPECTRAL TO PHYSICAL SPACE
!
! THIS SUBROUTINE IS A FFTW VERSION OF "horfft".
!
! PROGRAMMED BY: T.SAKAI 10-16-2012 (MODIFIED FROM ORIGINAL horfft)
use dim, only: nx,nxpp,nypl,nzpl,nxpl,ny
use mpicom
use MODFFTW
! i/o
integer, intent(IN) :: is
real, dimension(nxpp,nypl,nzpl), intent(inout) :: ain
real, dimension(nxpl,ny,nzpl), intent(inout) :: ainf
! Scratch arrays
real, allocatable, dimension(:,:,:) :: aintmp, aintmpf
! Local variables
integer :: k
real :: scalefactor, eps
allocate( aintmp(nxpp,nypl,nzpl), aintmpf(nxpl,ny,nzpl) )
aintmp=0.0
aintmpf=0.0
eps = 1.E-20 ! constant to remove noise coming from FFT
if ( is == -1 ) then
aintmp = ain
!-------------------
!-1-D FORWARD FFTs-|
!-------------------
! Perform 1-D FFT in x-direction (real-to-Complex) for each
! k-plane in processor
do k = 1,nzpl
call dfftw_execute_dft_r2c(PLAN_X_FWD,aintmp(1,1,k),aintmp(1,1,k))
end do
! Global data transposition
call dataflip_xtoy(aintmp,ainf)
! Perform 1-D FFT in y-direction (Complex-to-Complex) for each
! k-plane in processor
do k = 1,nzpl
call dfftw_execute_dft(PLAN_Y_FWD,ainf(1,1,k),ainf(1,1,k))
end do
! Scale the result
scalefactor = 1.0/real(nx*ny)
ainf = ainf*scalefactor
else
aintmpf = ainf
!-------------------
!-1-D INVERSE FFTs-|
!-------------------
! Perform 1-D FFT in y-direction (Complex-to-Complex)
! for each k-plane in processor
do k = 1,nzpl
call dfftw_execute_dft(PLAN_Y_BACKWD,aintmpf(1,1,k),aintmpf(1,1,k))
end do
! Global data transposition
call dataflip_ytox(aintmp,aintmpf)
! Perform 1-D INVERSE FFT in x-direction (Complex-to-real)
! for each k-plane in processor
do k = 1,nzpl
call dfftw_execute_dft_c2r(PLAN_X_BACKWD,aintmp(1,1,k),aintmp(1,1,k))
end do
ain = aintmp
end if
deallocate( aintmp, aintmpf )
end subroutine horfft
subroutine norm(uf,vf,wf,tempf)
use dim
use grid, only: wavx
! inputs/outputs
real, dimension(nxpl,ny,nzpl), intent(inout) :: uf,vf,wf,tempf
! local vars
integer :: nyhp, i,j,k, jp
! nyh = ny/2
nyhp = nyh + 1
! intended only to affect k_x=0
if (wavx(1)==0) then
! remove any imaginary part to the integral (avg) over entire domain
do k = 1,nzp
uf(2,1,k) = 0.0
vf(2,1,k) = 0.0
wf(2,1,k) = 0.0
tempf(2,1,k) = 0.0
enddo
endif
! if this is a 2-D problem, don't normalize
if (ny > 2) then
! intended only to affect k_x=0
if (wavx(1)==0) then
! Normalize by taking average of FFT value for ky and conjugate of -ky
! e.g. 0.5*( q(kx=0,ky,z) + conjg(q(kx=0,-ky,z)) )
do j = 2,nyh
jp = ny + 2 - j
do k = 1,nzp
uf(1,j,k) = .5*( uf(1,j,k) + uf(1,jp,k) )
uf(2,j,k) = .5*( uf(2,j,k) - uf(2,jp,k) )
vf(1,j,k) = .5*( vf(1,j,k) + vf(1,jp,k) )
vf(2,j,k) = .5*( vf(2,j,k) - vf(2,jp,k) )
wf(1,j,k) = .5*( wf(1,j,k) + wf(1,jp,k) )
wf(2,j,k) = .5*( wf(2,j,k) - wf(2,jp,k) )
tempf(1,j,k) = .5*( tempf(1,j,k) + tempf(1,jp,k) )
tempf(2,j,k) = .5*( tempf(2,j,k) - tempf(2,jp,k) )
enddo
enddo
! Copy over in correct order the normalized values into -ky half
! s.t. ky = 0, b, 2b, ...,(nyh-1)b,0,-(nyh-1)b,-(nyh-2)b,...,2b,b
! and q(kx,-ky,z) = conjg(q(kx=0,ky,z))
do j = 2,nyh
jp = ny + 2 - j
do k = 1,nzp
uf(1,jp,k) = uf(1,j,k)
uf(2,jp,k) = -uf(2,j,k)
vf(1,jp,k) = vf(1,j,k)
vf(2,jp,k) = -vf(2,j,k)
wf(1,jp,k) = wf(1,j,k)
wf(2,jp,k) = -wf(2,j,k)
tempf(1,jp,k) = tempf(1,j,k)
tempf(2,jp,k) = -tempf(2,j,k)
enddo
enddo
endif ! (wavx(1)==0)
! remove any value at ky=-nyh*beta which should really be ky=0
do k = 1,nzp
do i = 1,nxpl
uf(i,nyhp,k) = 0.0
vf(i,nyhp,k) = 0.0
wf(i,nyhp,k) = 0.0
tempf(i,nyhp,k) = 0.0
enddo
enddo
endif ! (ny > 2)
end subroutine norm
!**********************************************************
!
!-This file contains subroutines used to flip data across
!-processors on the horizontal plane in such a way that
!-one can transition from domain decomposition in the x-direction
!-to domain decomposition in the y-direction and vice versa.
!
!-Developed by PD: January 2005.
!
!-These subroutines are patterned after the similar ones
!-in Kraig Winters' spectral code (See JAOT article).
!
!-PD-6/16/08--C A R E F U L: This is a new version of the routines
!-developed at Cornell by PD.
!-The data transposition is no longer done with immediate and
!-ready communication modes for the sends & receives. Such an approach
!-did not factor in switch topology and contention issues in the MPI
!-library routines. As a result, on the ARL-MJM cluster with
!-either Open-MPI or Infiniserve-MPI (and not MPICH which was
!-problem free in the past) time-out problems would occur during
!-code execution.
!-
!-Following Kraig Winters' advice we use MPI_ALL_TO_ALL global
!-communication routines to perform the data transposition.
!-
!-NOTE-PD-6/16/08: The local array buffer is practically useless right now.
!-However, I have kept it in the code because dataflip is called by
!-a couple of other routines, namely RILEY_DBK, REGRID & EXTRAPOLATE
!-(in both postprocessor and main solver).
!-This array should eventually be removed !
!***********************************************************
subroutine dataflip_xtoy(x,xf)
!************************************************************
! Subroutine that reorders data from domain decomposition in
! in x-direction to d.d. in y-direction
!************************************************************
use dim, only: nx,nxpp,nypl,nzpl,nxpl,ny
use mpicom
!-Input variable x( , , ): nzpl 2-D slices of data partitioned normal to y-
!-direction and
!-Output variable xf( , ,): nzpl 2-D slices of data partitioned normal to x-direction
real, dimension(nxpp,nypl,nzpl), intent(in) :: x
real, dimension(nxpl,ny,nzpl), intent(out) :: xf
real, dimension(:,:,:,:), allocatable :: xtempin, xtempout
! real, dimension(nxpl,nzpl,nypl,nproch) :: xtempin, xtempout
! integer :: numbytes
! integer :: status_array(MPI_STATUS_SIZE)
! integer :: iproc,vslab
! integer :: istart,jstart,source,dest
integer :: i,j,k
integer :: AllocateStatus
integer :: size_of_block, iglob, jglob, iproc
!-Allocate local arrays
!- SEND BUFFER for MPI_ALLTOALL
!- RECV BUFFER for MPI_ALLTOALL
allocate( xtempin(nxpl,nzpl,nypl,nproch),xtempout(nxpl,nzpl,nypl,nproch),stat = AllocateStatus)
!
if (AllocateStatus /= 0) then
write(*,*) "**Not Enough Memory - DATAFLIP_XTOY**"
end if
! status_array = 0
!-Pack input array into send buffer.
!-Order of fastest increasing indices: i,k,j,iproc (in send buffer)
!-so that upon reception only a simple index change is needed
!-(i.e. received data will be contiguous in the j-direction when stored in memory).
do iproc=1,nproch
do k=1,nzpl
do j=1,nypl
do i=1,nxpl
iglob = (iproc-1)*nxpl + i
xtempin(i,k,j,iproc) = x(iglob,j,k)
end do
end do
end do
end do
!-Size of blocks to be transmitted to other processors
size_of_block = nxpl*nypl*nzpl
!***TAK 5-23-2012: ADD SYNCHRONIZATION BEFORE GLOBAL COLLECTIVE
! OPERATION. THIS IS TO ENSURE PORTABILITY.
!call MPI_BARRIER(comm,ierr)
!-Perform data transposition
call MPI_ALLTOALL(xtempin,size_of_block,mpi_double_precision,xtempout,size_of_block,mpi_double_precision,comm,ierr)
if (ierr /= 0) then
if (myid == 0) write(*,*) 'MPI_ALLTOALL error in DATAFLIP_XTOY'
call MPI_FINALIZE(ierr)
end if
!-Now copy the receive buffer from xtempout to the output array
do iproc=1,nproch
do k=1,nzpl
do j=1,nypl
do i=1,nxpl
jglob = (iproc-1)*nypl + j
xf(i,jglob,k) = xtempout(i,k,j,iproc)
end do
end do
end do
end do
!-Add a barrier to be on the safe side
!call MPI_BARRIER(comm,ierr)
!-De-Allocate local array
deallocate(xtempin,xtempout,stat=AllocateStatus)
if (AllocateStatus /= 0) then
write(*,*) "**Error Deallocating - DATAFLIP_XTOY**"
end if
end subroutine dataflip_xtoy
subroutine dataflip_ytox(x,xf)
!************************************************************
! Subroutine that reorders data from domain decomposition in
! in y-direction to d.d. in x-direction
!************************************************************
use dim, only: nx,nxpp,nypl,nzpl,nxpl,ny
use mpicom
!-Input variable xf( , ,): nzpl 2-D slices of data partitioned normal to x-direction
!-Output variable x( , , ): nzpl 2-D slices of data partitioned normal to y-direction.
real, dimension(nxpp,nypl,nzpl), intent(out) :: x
real, dimension(nxpl,ny,nzpl), intent(in) :: xf
real, dimension(:,:,:,:), allocatable :: xtempin,xtempout
! real, dimension(nypl,nzpl,nxpl,nproch) :: xtempin,xtempout
! integer status_array(MPI_STATUS_SIZE)
! integer :: iproc,vslab
! integer :: istart,jstart,source,dest,numwords,tag
integer :: i,j,k
integer :: AllocateStatus
integer :: size_of_block, iglob, jglob, iproc
!-Allocate local arrays
!- SEND BUFFER for MPI_ALLTOALL
!- RECV BUFFER for MPI_ALLTOALL
allocate( xtempin(nypl,nzpl,nxpl,nproch), xtempout(nypl,nzpl,nxpl,nproch),stat = AllocateStatus)
if (AllocateStatus /= 0) then
write(*,*) "**Not Enough Memory - DATAFLIP_YTOX**"
end if
! status_array = 0
!-Pack input array into send buffer.
!-Order of fastest increasing indices: j,k,i,iproc (in send buffer)
!-so that upon reception only a simple index change is needed
!-(i.e. received data will be contiguous in the i-direction when stored in memory).
do iproc=1,nproch
do k=1,nzpl
do j=1,nypl
do i=1,nxpl
jglob = (iproc-1)*nypl + j
xtempin(j,k,i,iproc) = xf(i,jglob,k)
end do
end do
end do
end do
!-Size of blocks to be transmitted to other processors
size_of_block = nxpl*nypl*nzpl
!***TAK 5-23-2012: ADD SYNCHRONIZATION BEFORE GLOBAL COLLECTIVE
! OPERATION. THIS IS TO ENSURE PORTABILITY.
!call MPI_BARRIER(comm,ierr)
!-Perform data transposition
call MPI_ALLTOALL(xtempin,size_of_block,mpi_double_precision,xtempout,size_of_block,mpi_double_precision,comm,ierr)
if (ierr /= 0) then
if (myid == 0) write(*,*) 'MPI_ALLTOALL error in DATAFLIP_YTOX'
call MPI_FINALIZE(ierr)
end if
!-Now copy the receive buffer from xtempout to the output array
do iproc=1,nproch
do k=1,nzpl
do j=1,nypl
do i=1,nxpl
iglob = (iproc-1)*nxpl + i
x(iglob,j,k) = xtempout(j,k,i,iproc)
end do
end do
end do
end do
!-Add a barrier to be on the safe side
!call MPI_BARRIER(comm,ierr)
!-De-Allocate local array
deallocate(xtempin,xtempout,stat=AllocateStatus)
if (AllocateStatus /= 0) then
write(*,*) "**Error Deallocating - DATAFLIP_YTOX**"
end if
end subroutine dataflip_ytox
end module modhorfft
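The index mapping that `dataflip_xtoy` delegates to MPI_ALLTOALL can be sketched in plain Ruby. This is illustrative only (no MPI; all sizes are made up): each "process" q starts with a y-decomposed slab, blocks are packed per destination, the all-to-all exchange is modeled as a block swap, and the receive side reassembles an x-decomposed slab.

```ruby
nproch, nxpl, nypl = 2, 3, 2
nx, ny = nproch * nxpl, nproch * nypl

# Global field g(i,j) = 10*i + j, decomposed in y:
# process q owns rows j = q*nypl .. q*nypl + nypl-1.
x = Array.new(nproch) do |q|
  Array.new(nx) { |i| Array.new(nypl) { |jl| 10 * i + (q * nypl + jl) } }
end

# Pack: on process q, the block destined for process p holds the
# x-range that p will own after the flip (iglob = p*nxpl + il).
send_buf = Array.new(nproch) do |q|
  Array.new(nproch) do |p|
    Array.new(nxpl) { |il| Array.new(nypl) { |jl| x[q][p * nxpl + il][jl] } }
  end
end

# MPI_ALLTOALL: block p of sender q arrives as block q at receiver p.
recv_buf = Array.new(nproch) { |p| Array.new(nproch) { |q| send_buf[q][p] } }

# Unpack: process p reassembles the full y-range from the blocks,
# where block q supplies rows jglob = q*nypl .. q*nypl + nypl-1.
xf = Array.new(nproch) do |p|
  Array.new(nxpl) do |il|
    Array.new(ny) { |jg| recv_buf[p][jg / nypl][il][jg % nypl] }
  end
end
```

After the exchange, `xf[p][il][jg]` equals `g(p*nxpl + il, jg)`, i.e. the decomposition has flipped from y to x without losing or duplicating any element; `dataflip_ytox` is the exact inverse of this mapping.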
|
import 'dart:async';
import 'package:e_szivacs/generated/i18n.dart';
import 'package:flutter/material.dart';
import 'package:flutter_html/flutter_html.dart';
import 'package:html_unescape/html_unescape.dart';
import '../Datas/Homework.dart';
import '../Datas/User.dart';
import '../Dialog/TimeSelectDialog.dart';
import '../GlobalDrawer.dart';
import '../Helpers/HomeworkHelper.dart';
import '../Utils/StringFormatter.dart';
import '../globals.dart' as globals;
void main() {
runApp(new MaterialApp(home: new HomeworkScreen()));
}
class HomeworkScreen extends StatefulWidget {
@override
HomeworkScreenState createState() => new HomeworkScreenState();
}
class HomeworkScreenState extends State<HomeworkScreen> {
List<User> users;
User selectedUser;
bool hasLoaded = true;
bool hasOfflineLoaded = false;
List<Homework> homeworks = new List();
List<Homework> selectedHomework = new List();
@override
void initState() {
super.initState();
initSelectedUser();
_onRefreshOffline();
}
void initSelectedUser() async {
setState(() {
selectedUser = globals.selectedUser;
});
}
void refHomework() {
setState(() {
selectedHomework.clear();
});
for (Homework n in homeworks) {
if (n.owner.id == selectedUser.id) {
setState(() {
selectedHomework.add(n);
});
}
}
}
@override
Widget build(BuildContext context) {
return new WillPopScope(
onWillPop: () {
globals.screen = 0;
Navigator.pushReplacementNamed(context, "/main");
return Future.value(false);
},
child: Scaffold(
drawer: GDrawer(),
appBar: new AppBar(
title: new Text(S.of(context).homeworks),
actions: <Widget>[
new IconButton(
icon: new Icon(Icons.access_time),
onPressed: () {
timeDialog().then((b) {
_onRefreshOffline();
refHomework();
_onRefresh();
refHomework();
});
},
),
],
),
body: new Container(
child: hasOfflineLoaded
? new Column(children: <Widget>[
!hasLoaded
? Container(
child: new LinearProgressIndicator(
value: null,
),
height: 3,
)
: Container(
height: 3,
),
new Expanded(child: new RefreshIndicator(
child: new ListView.builder(
itemBuilder: _itemBuilder,
itemCount: selectedHomework.length,
),
onRefresh: _onRefresh))])
: new Center(child: new CircularProgressIndicator()))));
}
Future<bool> timeDialog() {
return showDialog(
barrierDismissible: true,
context: context,
builder: (BuildContext context) {
return new TimeSelectDialog();
},
) ??
false;
}
Future<Null> homeworksDialog(Homework homework) async {
return showDialog<Null>(
context: context,
barrierDismissible: true, // dialog can be dismissed by tapping outside
builder: (BuildContext context) {
return new AlertDialog(
title: new Text(homework.subject + " " + S.of(context).homework),
content: new SingleChildScrollView(
child: new ListBody(
children: <Widget>[
homework.deadline != null
? new Text(S.of(context).deadline + homework.deadline)
: new Container(),
new Text(S.of(context).subject + homework.subject),
new Text(S.of(context).uploader + homework.uploader),
new Text(S.of(context).upload_time +
homework.uploadDate
.substring(0, 16)
.replaceAll("-", '. ')
.replaceAll("T", ". ")),
new Divider(
height: 4.0,
),
Container(
padding: EdgeInsets.only(top: 10),
),
new Html(data: HtmlUnescape().convert(homework.text)),
],
),
),
actions: <Widget>[
new FlatButton(
child: new Text(S.of(context).ok),
onPressed: () {
Navigator.of(context).pop();
},
),
],
);
},
);
}
Future<Null> _onRefresh() async {
setState(() {
hasLoaded = false;
});
Completer<Null> completer = new Completer<Null>();
homeworks = await HomeworkHelper()
.getHomeworks(globals.idoAdatok[globals.selectedTimeForHomework]);
homeworks
.sort((Homework a, Homework b) => b.uploadDate.compareTo(a.uploadDate));
if (mounted)
setState(() {
refHomework();
hasLoaded = true;
hasOfflineLoaded = true;
completer.complete();
});
return completer.future;
}
Future<Null> _onRefreshOffline() async {
setState(() {
hasOfflineLoaded = false;
});
Completer<Null> completer = new Completer<Null>();
homeworks = await HomeworkHelper().getHomeworksOffline(
globals.idoAdatok[globals.selectedTimeForHomework]);
homeworks
.sort((Homework a, Homework b) => b.uploadDate.compareTo(a.uploadDate));
if (mounted)
setState(() {
refHomework();
hasOfflineLoaded = true;
completer.complete();
});
return completer.future;
}
Widget _itemBuilder(BuildContext context, int index) {
return new Column(
children: <Widget>[
new ListTile(
title: new Text(
selectedHomework[index].uploadDate.substring(0, 10) +
" " +
dateToWeekDay(
DateTime.parse(selectedHomework[index].uploadDate)) +
(selectedHomework[index].subject == null
? ""
: (" - " + selectedHomework[index].subject)),
style: TextStyle(fontSize: 20.0),
),
subtitle: new Html(
data: HtmlUnescape().convert(selectedHomework[index].text)),
isThreeLine: true,
onTap: () {
homeworksDialog(selectedHomework[index]);
},
),
new Divider(
height: 5.0,
),
],
);
}
@override
void dispose() {
selectedHomework.clear();
super.dispose();
}
}
|
package service;
import dao.ProductDao;
import model.Product;
import javax.ws.rs.*;
import javax.ws.rs.core.MediaType;
import java.util.List;
/**
* @author xzinoviou
*/
@Path("/products")
public class ProductService {
private ProductDao productDao;
public ProductService(){
super();
productDao = new ProductDao();
}
@GET
@Path("/{id}")
@Produces(MediaType.APPLICATION_JSON)
public Product getProductById(@PathParam("id") String id){
return productDao.getProductById(Integer.parseInt(id));
}
@GET
@Path("/all")
@Produces(MediaType.APPLICATION_JSON)
public List<Product> getAllProducts(){
return productDao.getAllProducts();
}
}
|
package SimulationTest.one.exam6.exam1.test16.oca;
/**
* Java SE 11 Programmer I_1Z0-815
* Paulo Alexander Chirán Portillo
* [email protected]
*/
public class A {
public void print() {
System.out.println("A");
}
}
|
import 'dart:async';
import 'package:bnb_wallet/infrastructures/restful/login/repository_login.dart';
import 'package:bnb_wallet/infrastructures/restful/login/response_my_certificate.dart';
import 'package:bnb_wallet/utils/constans_util.dart';
import 'package:bnb_wallet/utils/dialog_util.dart';
import 'package:bnb_wallet/utils/image_path.dart';
import 'package:bnb_wallet/utils/sp_util.dart';
import 'package:bnb_wallet/utils/toast_util.dart';
import 'package:bnb_wallet/view/my/certification/page_certification.dart';
import 'package:bnb_wallet/view/my/certification/page_country_choice.dart';
import 'package:bnb_wallet/view/my/certification/widgets/dialog_widget.dart';
import 'package:flutter/cupertino.dart';
import 'package:flutter/material.dart';
abstract class CertificationState extends State<CertificationPage> {
/// Controller for the real-name text field
TextEditingController realNameEditController = TextEditingController();
/// Controller for the ID-number text field
TextEditingController numIDEditController = TextEditingController();
String country = '中国';
String typeID = '身份证';
int idType = 1;
bool hasCertified;
String realName = '';
String idNumber = '';
@override
void initState() {
super.initState();
hasCertified = SpUtil.getBool(SP_BOOL_PERSON) ?? false;
if (hasCertified) {
realNameEditController.text = SpUtil.getString(SP_REAL_NAME);
numIDEditController.text = SpUtil.getString(SP_ID_NUMBER);
}
}
/// Navigate to the country/region selection page
void pushCountryChoice() {
Navigator.of(context).push(
CupertinoPageRoute(
builder: (BuildContext context) {
return CountryChoicePage();
},
),
).then((countryInfo) {
setState(() {
country = countryInfo.name;
changeIDType(country);
});
});
}
changeIDType(String name) {
switch (name) {
case '中国':
typeID = '身份证';
idType = 1;
break;
case '香港 (中国)':
case '澳门(中国)':
typeID = '港澳居民来往内地通行证';
idType = 2;
break;
case "台湾 (中国)":
typeID = '台湾居民来往内地通行证';
idType = 2;
break;
default:
typeID = '护照';
idType = 3;
break;
}
}
/// Submit identity verification
uploadCertificate() async {
realName = realNameEditController.text;
idNumber = numIDEditController.text;
if (realName == '' || idNumber == '') {
ToastUtil.show('真实姓名或证件号码不能为空');
return;
}
if (idNumber.length < 18) {
ToastUtil.show('证件号码位数错误');
return;
}
try {
var response = await uploadCertificateInfo(
locate: country,
realName: realNameEditController.text,
type: idType,
idNumber: numIDEditController.text,
);
if (response.errorCode == NO_ERROR_CODE) {
DialogUtil.showMyToastDialog(
context: context,
chidWidget: UploadDialogWidget(
title: '提交成功',
content: '请耐心等待工作人员审核',
imagePath: success,
),
duration: const Duration(milliseconds: 2000));
SpUtil.putBool(SP_BOOL_PERSON, true);
SpUtil.putString(SP_REAL_NAME, realName);
SpUtil.putString(SP_ID_NUMBER, idNumber);
Future.delayed(Duration(milliseconds: 2000)).then((value) {
Navigator.of(context).pop();
});
} else {
print('uploadCertificateInfo response $response');
DialogUtil.showSimpleDialog(
context: context, content: '${response.message}');
SpUtil.putBool(SP_BOOL_PERSON, false);
}
} catch (error) {
ToastUtil.show('上传失败 $error');
SpUtil.putBool(SP_BOOL_PERSON, false);
}
}
}
|
package io.prediction.algorithms.mahout.itemrec.knnitembased
import org.specs2.mutable._
import com.github.nscala_time.time.Imports._
import scala.io.Source
import java.io.File
import java.io.FileWriter
import java.io.BufferedWriter
import io.prediction.algorithms.mahout.itemrec.MahoutJob
import io.prediction.algorithms.mahout.itemrec.TestUtils
class KNNItemBasedJobSpec extends Specification {
val ratingsCSV = List(
"1,1,3",
"4,1,5",
"1,2,3",
"3,2,2",
"4,2,4",
"1,3,4",
"2,3,4",
"3,3,2",
"2,4,2",
"3,4,3",
"4,4,2"
)
val itemsIndexTSV = List(
s"1\ti1\tt1,t2\t12345000",
s"2\ti2\tt1\t12346000",
s"3\ti3\tt2,t3\t12346100",
s"4\ti4\tt3\t12347100"
)
val appid = 25
val engineid = 31
val algoid = 32
val jobName =
"io.prediction.algorithms.mahout.itemrec.knnitembased.KNNItemBasedJob"
"KNNItemBasedJob with unseenOnly=false" should {
val testDir = "/tmp/pio_test/KNNItemBasedJob/unseenOnlyfalse/"
val inputFile = s"${testDir}ratings.csv"
val itemsFile = s"${testDir}itemsIndex.tsv"
val outputFile = s"${testDir}predicted.tsv"
val outputSim = s"${testDir}sim.csv"
val testDirFile = new File(testDir)
testDirFile.mkdirs()
val jobArgs = Map(
"input" -> inputFile,
"itemsFile" -> itemsFile,
"output" -> outputFile,
"appid" -> appid,
"engineid" -> engineid,
"algoid" -> algoid,
"booleanData" -> false,
"numRecommendations" -> 5,
"itemSimilarity" -> "LogLikelihoodSimilarity",
"weighted" -> false,
"nearestN" -> 10,
"threshold" -> 4.9E-324,
"outputSim" -> outputSim,
"preComputeItemSim" -> false,
"unseenOnly" -> false,
"recommendationTime" -> DateTime.now.millis
)
TestUtils.writeToFile(ratingsCSV, inputFile)
TestUtils.writeToFile(itemsIndexTSV, itemsFile)
val predictedExpected = List(
"1\t[3:3.4408236,1:3.2995765,4:3.2805154,2:3.2180138]",
"2\t[3:3.338186,1:3.0,2:3.0,4:2.661814]",
"3\t[4:2.5027347,1:2.3333333,2:2.2486327,3:2.2486327]",
"4\t[2:3.905135,3:3.8779385,1:3.8016937,4:3.4595158]"
)
MahoutJob.main(Array(jobName) ++ TestUtils.argMapToArray(jobArgs))
"generate prediction output correctly" in {
val predicted = Source.fromFile(outputFile)
.getLines().toList
predicted must containTheSameElementsAs(predictedExpected)
}
}
"KNNItemBasedJob with unseenOnly=false and subset itemsIndex" should {
val testDir = "/tmp/pio_test/KNNItemBasedJob/unseenOnlyfalseSubSetItemsIndex/"
val inputFile = s"${testDir}ratings.csv"
val itemsFile = s"${testDir}itemsIndex.tsv"
val outputFile = s"${testDir}predicted.tsv"
val outputSim = s"${testDir}sim.csv"
    val itemsIndexTSV = List(
      "2\ti2\tt1\t12346000",
      "4\ti4\tt3\t12347100"
    )
val testDirFile = new File(testDir)
testDirFile.mkdirs()
val jobArgs = Map(
"input" -> inputFile,
"itemsFile" -> itemsFile,
"output" -> outputFile,
"appid" -> appid,
"engineid" -> engineid,
"algoid" -> algoid,
"booleanData" -> false,
"numRecommendations" -> 5,
"itemSimilarity" -> "LogLikelihoodSimilarity",
"weighted" -> false,
"nearestN" -> 10,
"threshold" -> 4.9E-324,
"outputSim" -> outputSim,
"preComputeItemSim" -> false,
"unseenOnly" -> false,
"recommendationTime" -> DateTime.now.millis
)
TestUtils.writeToFile(ratingsCSV, inputFile)
TestUtils.writeToFile(itemsIndexTSV, itemsFile)
val predictedExpected = List(
"1\t[4:3.2805154,2:3.2180138]",
"2\t[2:3.0,4:2.661814]",
"3\t[4:2.5027347,2:2.2486327]",
"4\t[2:3.905135,4:3.4595158]"
)
MahoutJob.main(Array(jobName) ++ TestUtils.argMapToArray(jobArgs))
"generate prediction output correctly" in {
val predicted = Source.fromFile(outputFile)
.getLines().toList
predicted must containTheSameElementsAs(predictedExpected)
}
}
"KNNItemBasedJob with unseenOnly=true" should {
val testDir = "/tmp/pio_test/KNNItemBasedJob/unseenOnlytrue/"
val inputFile = s"${testDir}ratings.csv"
val itemsFile = s"${testDir}itemsIndex.tsv"
val outputFile = s"${testDir}predicted.tsv"
val outputSim = s"${testDir}sim.csv"
val testDirFile = new File(testDir)
testDirFile.mkdirs()
val jobArgs = Map(
"input" -> inputFile,
"itemsFile" -> itemsFile,
"output" -> outputFile,
"appid" -> appid,
"engineid" -> engineid,
"algoid" -> algoid,
"booleanData" -> false,
"numRecommendations" -> 5,
"itemSimilarity" -> "LogLikelihoodSimilarity",
"weighted" -> false,
"nearestN" -> 10,
"threshold" -> 4.9E-324,
"outputSim" -> outputSim,
"preComputeItemSim" -> false,
"unseenOnly" -> true,
"recommendationTime" -> DateTime.now.millis
)
TestUtils.writeToFile(ratingsCSV, inputFile)
TestUtils.writeToFile(itemsIndexTSV, itemsFile)
val predictedExpected = List(
"1\t[4:3.2805154]",
"2\t[1:3.0,2:3.0]",
"3\t[1:2.3333333]",
"4\t[3:3.8779385]"
)
MahoutJob.main(Array(jobName) ++ TestUtils.argMapToArray(jobArgs))
"generate prediction output correctly" in {
val predicted = Source.fromFile(outputFile)
.getLines().toList
predicted must containTheSameElementsAs(predictedExpected)
}
}
"KNNItemBasedJob with unseenOnly=true and seenFile" should {
val testDir = "/tmp/pio_test/KNNItemBasedJob/unseenOnlytrueSeenFile/"
val inputFile = s"${testDir}ratings.csv"
val itemsFile = s"${testDir}itemsIndex.tsv"
val outputFile = s"${testDir}predicted.tsv"
val outputSim = s"${testDir}sim.csv"
val seenFile = s"${testDir}seen.csv"
val testDirFile = new File(testDir)
testDirFile.mkdirs()
val seenCSV = List(
"1,1",
"4,1",
"1,2"
)
val jobArgs = Map(
"input" -> inputFile,
"itemsFile" -> itemsFile,
"output" -> outputFile,
"appid" -> appid,
"engineid" -> engineid,
"algoid" -> algoid,
"booleanData" -> false,
"numRecommendations" -> 5,
"itemSimilarity" -> "LogLikelihoodSimilarity",
"weighted" -> false,
"nearestN" -> 10,
"threshold" -> 4.9E-324,
"outputSim" -> outputSim,
"preComputeItemSim" -> false,
"unseenOnly" -> true,
"seenFile" -> seenFile,
"recommendationTime" -> DateTime.now.millis
)
TestUtils.writeToFile(ratingsCSV, inputFile)
TestUtils.writeToFile(itemsIndexTSV, itemsFile)
TestUtils.writeToFile(seenCSV, seenFile)
val predictedExpected = List(
"1\t[3:3.4408236,4:3.2805154]",
"2\t[3:3.338186,1:3.0,2:3.0,4:2.661814]",
"3\t[4:2.5027347,1:2.3333333,2:2.2486327,3:2.2486327]",
"4\t[2:3.905135,3:3.8779385,4:3.4595158]"
)
MahoutJob.main(Array(jobName) ++ TestUtils.argMapToArray(jobArgs))
"generate prediction output correctly" in {
val predicted = Source.fromFile(outputFile)
.getLines().toList
predicted must containTheSameElementsAs(predictedExpected)
}
}
"KNNItemBasedJob with unseenOnly=true and seenFile and subset itemsIndex" should {
val testDir = "/tmp/pio_test/KNNItemBasedJob/unseenOnlytrueSeenFileSubSetItemsIndex/"
val inputFile = s"${testDir}ratings.csv"
val itemsFile = s"${testDir}itemsIndex.tsv"
val outputFile = s"${testDir}predicted.tsv"
val outputSim = s"${testDir}sim.csv"
val seenFile = s"${testDir}seen.csv"
val testDirFile = new File(testDir)
testDirFile.mkdirs()
    val itemsIndexTSV = List(
      "1\ti1\tt1,t2\t12345000",
      "2\ti2\tt1\t12346000",
      "4\ti4\tt3\t12347100"
    )
val seenCSV = List(
"1,1",
"4,1",
"1,2"
)
val jobArgs = Map(
"input" -> inputFile,
"itemsFile" -> itemsFile,
"output" -> outputFile,
"appid" -> appid,
"engineid" -> engineid,
"algoid" -> algoid,
"booleanData" -> false,
"numRecommendations" -> 5,
"itemSimilarity" -> "LogLikelihoodSimilarity",
"weighted" -> false,
"nearestN" -> 10,
"threshold" -> 4.9E-324,
"outputSim" -> outputSim,
"preComputeItemSim" -> false,
"unseenOnly" -> true,
"seenFile" -> seenFile,
"recommendationTime" -> DateTime.now.millis
)
TestUtils.writeToFile(ratingsCSV, inputFile)
TestUtils.writeToFile(itemsIndexTSV, itemsFile)
TestUtils.writeToFile(seenCSV, seenFile)
val predictedExpected = List(
"1\t[4:3.2805154]",
"2\t[2:3.0,1:3.0,4:2.661814]",
"3\t[4:2.5027347,1:2.3333333,2:2.2486327]",
"4\t[2:3.905135,4:3.4595158]"
)
MahoutJob.main(Array(jobName) ++ TestUtils.argMapToArray(jobArgs))
"generate prediction output correctly" in {
val predicted = Source.fromFile(outputFile)
.getLines().toList
predicted must containTheSameElementsAs(predictedExpected)
}
}
"KNNItemBasedJob with unseenOnly=true and empty seenFile" should {
val testDir = "/tmp/pio_test/KNNItemBasedJob/unseenOnlytrueEmptySeenFile/"
val inputFile = s"${testDir}ratings.csv"
val itemsFile = s"${testDir}itemsIndex.tsv"
val outputFile = s"${testDir}predicted.tsv"
val outputSim = s"${testDir}sim.csv"
val seenFile = s"${testDir}seen.csv"
val testDirFile = new File(testDir)
testDirFile.mkdirs()
val seenCSV = List()
val jobArgs = Map(
"input" -> inputFile,
"itemsFile" -> itemsFile,
"output" -> outputFile,
"appid" -> appid,
"engineid" -> engineid,
"algoid" -> algoid,
"booleanData" -> false,
"numRecommendations" -> 5,
"itemSimilarity" -> "LogLikelihoodSimilarity",
"weighted" -> false,
"nearestN" -> 10,
"threshold" -> 4.9E-324,
"outputSim" -> outputSim,
"preComputeItemSim" -> false,
"unseenOnly" -> true,
"seenFile" -> seenFile,
"recommendationTime" -> DateTime.now.millis
)
TestUtils.writeToFile(ratingsCSV, inputFile)
TestUtils.writeToFile(itemsIndexTSV, itemsFile)
TestUtils.writeToFile(seenCSV, seenFile)
val predictedExpected = List(
"1\t[3:3.4408236,1:3.2995765,4:3.2805154,2:3.2180138]",
"2\t[3:3.338186,1:3.0,2:3.0,4:2.661814]",
"3\t[4:2.5027347,1:2.3333333,2:2.2486327,3:2.2486327]",
"4\t[2:3.905135,3:3.8779385,1:3.8016937,4:3.4595158]"
)
MahoutJob.main(Array(jobName) ++ TestUtils.argMapToArray(jobArgs))
"generate prediction output correctly" in {
val predicted = Source.fromFile(outputFile)
.getLines().toList
predicted must containTheSameElementsAs(predictedExpected)
}
}
// TODO: add more tests...
}
|
package com.ttulka.ecommerce.common.events
/**
* A Domain Event is a role, and thus should be represented explicitly.
*/
interface DomainEvent
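// An illustrative sketch (hypothetical, not part of the original codebase) of a
// concrete event playing this role explicitly, as an immutable value object:
//
//   data class OrderPlaced(val orderId: String) : DomainEvent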
|
import { createDocClient } from './createDynamodbDocClient';
const docClient = createDocClient();
const params = {
TableName: 'Movies',
ProjectionExpression: '#yr, title, info.rating',
FilterExpression: '#yr between :start_yr and :end_yr',
ExpressionAttributeNames: {
'#yr': 'year',
},
ExpressionAttributeValues: {
':start_yr': 1950,
':end_yr': 1959,
},
};
console.log('Scanning Movies table.');
docClient.scan(params, onScan);
function onScan(err, data) {
if (err) {
console.error(
'Unable to scan the table. Error JSON:',
JSON.stringify(err, null, 2),
);
} else {
// print all the movies
console.log('Scan succeeded.');
data.Items.forEach(function (movie) {
console.log(
movie.year + ': ',
movie.title,
'- rating:',
movie.info.rating,
);
});
// continue scanning if we have more movies, because
// scan can retrieve a maximum of 1MB of data
    if (typeof data.LastEvaluatedKey !== 'undefined') {
console.log('Scanning for more...');
params.ExclusiveStartKey = data.LastEvaluatedKey;
docClient.scan(params, onScan);
}
}
}
|
package io.codelabs.zenitech.widget
import android.content.Context
import android.graphics.Color
import android.util.AttributeSet
import android.view.View
import android.widget.FrameLayout
import androidx.appcompat.widget.AppCompatTextView
import io.codelabs.zenitech.R
/**
* Composite view to show an item containing a text label and a [ColorDotView].
*/
class ColorAttributeView @JvmOverloads constructor(
context: Context,
attrs: AttributeSet? = null,
defStyleAttr: Int = 0,
defStyleRes: Int = 0
) : FrameLayout(context, attrs, defStyleAttr, defStyleRes) {
private val colorAttributeTextView: AppCompatTextView
private val colorDotView: ColorDotView
private var attributeText: String = ""
set(value) {
colorAttributeTextView.text = value
field = value
}
private var dotFillColor: Int = Color.LTGRAY
set(value) {
colorDotView.fillColor = value
field = value
}
private var dotStrokeColor: Int = Color.DKGRAY
set(value) {
colorDotView.strokeColor = value
field = value
}
init {
val view = View.inflate(context, R.layout.color_attribute_view_layout, this)
colorAttributeTextView = view.findViewById(R.id.color_attribute)
colorDotView = view.findViewById(R.id.color_dot)
val a = context.theme.obtainStyledAttributes(
attrs,
R.styleable.ColorAttributeView,
defStyleAttr,
defStyleRes
)
attributeText = a.getString(
R.styleable.ColorAttributeView_android_text
) ?: attributeText
dotFillColor = a.getColor(R.styleable.ColorAttributeView_colorFillColor, dotFillColor)
dotStrokeColor = a.getColor(
R.styleable.ColorAttributeView_colorStrokeColor,
dotStrokeColor
)
a.recycle()
}
}
|
using System;
using System.Collections.Generic;
using System.Text;
using YouZan.Open.Common.Extensions.Attributes;
namespace YouZan.Open.Api.Entry.Request.Users
{
public class UserWeiXinOpenIdGetRequest : YouZanRequest
{
        /// <summary>
        /// International country code of the mobile number; currently only mainland-China numbers are supported.
        /// </summary>
        [ApiField("country_code")]
        public string CountryCode { get; set; }

        /// <summary>
        /// Mobile number.
        /// </summary>
        [ApiField("mobile")]
        public string Mobile { get; set; }

        /// <summary>
        /// WeChat account type:
        /// 1 - Official Account (公众号)
        /// 2 - Mini Program (小程序)
        /// </summary>
        [ApiField("wechat_type")]
        public int WechatType { get; set; }
}
}
|
using System.Collections.Generic;
using System.Linq;
using JetBrains.Metadata.Reader.API;
using JetBrains.Util;
using Xunit.Abstractions;
namespace XunitContrib.Runner.ReSharper.UnitTestProvider
{
// TODO: Cache and reuse objects
public class MetadataAssemblyInfoAdapter : IAssemblyInfo
{
private readonly IMetadataAssembly assembly;
public MetadataAssemblyInfoAdapter(IMetadataAssembly assembly)
{
this.assembly = assembly;
}
public IEnumerable<IAttributeInfo> GetCustomAttributes(string assemblyQualifiedAttributeTypeName)
{
var fullName = assemblyQualifiedAttributeTypeName.Substring(0,
assemblyQualifiedAttributeTypeName.IndexOf(','));
return assembly.GetCustomAttributes(fullName)
.Select(a => (IAttributeInfo)new MetadataAttributeInfoAdapter2(a));
}
public ITypeInfo GetType(string typeName)
{
var metadataType = assembly.GetTypeFromQualifiedName(typeName, false);
Assertion.Assert(metadataType is IMetadataClassType, "Expected type to be IMetadataClassType: {0}",
metadataType.GetType().Name);
// I'd like to assert that the type is resolved, but it doesn't work for closed generics!
//Assertion.Assert(metadataType.IsResolved, "Cannot resolve type: {0}", metadataType);
//if (!metadataType.IsResolved)
// return null;
return new MetadataTypeInfoAdapter2((IMetadataClassType) metadataType);
}
public IEnumerable<ITypeInfo> GetTypes(bool includePrivateTypes)
{
// Still don't know why I can't use assembly.GetExportedTypes()
return from typeInfo in assembly.GetTypes()
where includePrivateTypes || typeInfo.IsPublic || typeInfo.IsNestedPublic
select (ITypeInfo)new MetadataTypeInfoAdapter2(typeInfo);
}
public string AssemblyPath
{
get { return assembly.Location.FullPath; }
}
public string Name
{
get { return assembly.AssemblyName.FullName; }
}
}
}
|
/*-
* Copyright (c) 2019 TAO Zhijiang<[email protected]>
*
* Licensed under the BSD-3-Clause license, see LICENSE for full information.
*
*/
#include <sys/types.h>
#include <ifaddrs.h>
#include <sys/socket.h>
#include <netdb.h>

#include <cctype>
#include <cerrno>
#include <cstdio>
#include <cstdlib>
#include <cstring>
#include <limits>
#include <strings.h>

#include "zkPath.h"

namespace Clotho {

// Get the local host's IP addresses, excluding loopback and invalid addresses.
std::vector<std::string> zkPath::get_local_ips() {
struct ifaddrs* ifaddr = NULL;
int family = 0;
int ec = 0;
char host[NI_MAXHOST]{};
std::vector<std::string> ips{};
if (getifaddrs(&ifaddr) == -1) {
fprintf(stderr, "getifaddrs failed. errno: %d:%s", errno, strerror(errno));
return ips;
}
for (struct ifaddrs* ifa = ifaddr; ifa != NULL; ifa = ifa->ifa_next) {
if (ifa->ifa_addr == NULL)
continue;
if (ifa->ifa_name == NULL || ::strncasecmp(ifa->ifa_name, "lo", strlen("lo")) == 0)
continue;
family = ifa->ifa_addr->sa_family;
if (family != AF_INET)
continue;
ec = getnameinfo(ifa->ifa_addr, sizeof(struct sockaddr_in),
host, NI_MAXHOST, NULL, 0, NI_NUMERICHOST);
if (ec != 0) {
fprintf(stderr, "getnameinfo() failed: %s", gai_strerror(ec));
continue;
}
        if (strncmp(host, "127", strlen("127")) == 0 ||
            strncmp(host, "169.254", strlen("169.254")) == 0) {
            // ignore loopback and link-local (invalid) addresses
continue;
}
ips.push_back(host);
}
freeifaddrs(ifaddr);
return ips;
}
std::string zkPath::get_local_ip() {
std::vector<std::string> ips = get_local_ips();
if (!ips.empty())
return ips[::random() % ips.size()];
return "";
}
enum PathType zkPath::guess_path_type(const std::string& path) {
std::string n_path = normalize_path(path);
std::vector<std::string> items{};
if (n_path.empty() || n_path.at(0) != '/')
return PathType::kUndetected;
split(n_path, "/", items);
if (items.empty())
return PathType::kUndetected;
if (items.size() == 1) {
return PathType::kDepartment;
} else if (items.size() == 2) {
return PathType::kService;
} else if (items.size() == 3) {
if (validate_node(items[2]))
return PathType::kNode;
return PathType::kServiceProperty;
} else if (items.size() == 4) {
if (validate_node(items[2]))
return PathType::kNodeProperty;
}
return PathType::kUndetected;
}
void zkPath::split(const std::string& str,
const std::string& needle, std::vector<std::string>& vec) {
std::string::size_type pos = 0;
std::string::size_type oldPos = 0;
while (true) {
pos = str.find_first_of(needle, oldPos);
if (std::string::npos == pos) {
auto item = str.substr(oldPos);
if (!item.empty())
vec.push_back(item);
break;
}
auto item = str.substr(oldPos, pos - oldPos);
if (!item.empty())
vec.push_back(item);
oldPos = pos + 1;
}
}
// Normalize the path: trim surrounding whitespace, collapse consecutive '/'
// characters in the middle, and drop any trailing '/'.
std::string zkPath::normalize_path(const std::string& str) {
std::string copy_str = str;
size_t index = 0;
// trim left whitespace
for (index = 0; index < copy_str.size() && isspace(copy_str[index]); ++index)
/* do nothing*/;
copy_str.erase(0, index);
// trim right whitespace
for (index = copy_str.size(); index > 0 && isspace(copy_str[index - 1]); --index)
/* do nothing*/;
copy_str.erase(index);
std::string result{};
for (size_t i = 0; i < copy_str.size(); ++i) {
if (copy_str[i] == '/' && !result.empty() && result.at(result.size() - 1) == '/') {
continue;
}
result.push_back(copy_str[i]);
}
if (!result.empty() && result.at(result.size() - 1) == '/')
result.erase(result.size() - 1);
return result;
}
// Strictly validate an ip:port node name.
// 0.0.0.0:1000 is considered a valid address.
bool zkPath::validate_node(const std::string& node_name) {
std::vector<std::string> vec{};
split(node_name, ":.", vec);
if (vec.size() != 5)
return false;
for (size_t i = 0; i < 4; ++i) {
int num = ::atoi(vec[i].c_str());
if (num < 0 || num > std::numeric_limits<uint8_t>::max())
return false;
}
int port = ::atoi(vec[4].c_str());
if (port <= 0 || port >= std::numeric_limits<uint16_t>::max())
return false;
return true;
}
// Strictly validate an ip:port node name, extracting the IP and port.
bool zkPath::validate_node(const std::string& node_name, std::string& ip, uint16_t& port) {
std::vector<std::string> vec{};
split(node_name, ":.", vec);
if (vec.size() != 5)
return false;
for (size_t i = 0; i < 4; ++i) {
int num = ::atoi(vec[i].c_str());
if (num < 0 || num > std::numeric_limits<uint8_t>::max())
return false;
}
    // Range-check in a plain int first: assigning atoi's result straight into
    // the uint16_t out-parameter would silently truncate out-of-range ports.
    int port_num = ::atoi(vec[4].c_str());
    if (port_num <= 0 || port_num >= std::numeric_limits<uint16_t>::max())
        return false;
    port = static_cast<uint16_t>(port_num);
char ip_str[32]{};
snprintf(ip_str, sizeof(ip_str), "%s.%s.%s.%s",
vec[0].c_str(), vec[1].c_str(), vec[2].c_str(), vec[3].c_str());
ip = ip_str;
return true;
}
} // end namespace Clotho
|
cp /course/cs5700sp17/popular_raw.html . &&
python popular_to_text.py &&
rm popular_raw.html
|
package com.dsc.hosp.rabbitmq.constant;
/**
* @author MQConst
*/
public class MqConst {
    public static final String EXCHANGE_DIRECT_ORDER = "exchange.direct.order";
    public static final String ROUTING_ORDER = "order";
    public static final String QUEUE_ORDER = "queue.order";
    public static final String EXCHANGE_DIRECT_MSM = "exchange.direct.msm";
    public static final String ROUTING_MSM_ITEM = "msm.item";
    public static final String QUEUE_MSM_ITEM = "queue.msm.item";
public static final String QUEUE_TASK_8 = "queue.task.8";
public static final String EXCHANGE_DIRECT_TASK = "exchange.direct.task";
public static final String ROUTING_TASK_8 = "task.8";
}
|
package com.github.scaruby.collection
import scala.collection.mutable
import org.scalatest.diagrams.Diagrams
import org.scalatest.funspec.AnyFunSpec
class RichMapSpec extends AnyFunSpec with Diagrams {
describe("RichMap") {
val source = Map("X" -> 1, "Y" -> 2, "Z" -> 3)
it("keyOf") {
assert(Some("X") == source.keyOf(1))
assert(Some("Y") == source.keyOf(2))
assert(Some("Z") == source.keyOf(3))
assert(None == source.keyOf(4))
}
it("valuesOf") {
assert(Seq(1, 2) == source.valuesOf("X", "Y"))
assert(Seq(1, 3) == source.valuesOf("X", "Z"))
assert(Seq(3, 1) == source.valuesOf("Z", "X"))
assert(Seq(1, 2, 3) == source.valuesOf("X", "Y", "Z"))
}
it("invert") {
assert(Map(1 -> "X", 2 -> "Y", 3 -> "Z") == source.invert)
}
}
}
|
/**
* (c) Meta Platforms, Inc. and affiliates. Confidential and proprietary.
* This file is generated. Do not modify it manually!
* @codegen-command : phps RepoSync intl_oss_fbt
* @codegen-source : fbsource/xplat/intl/oss-fbt/packages/react-native-fbt/js/NativeFbtModule.js
* @generated SignedSource<<96f13b0ee798f970d5e23c3d9873fe80>>
* Copyright (c) Meta Platforms, Inc. and affiliates.
*
* This source code is licensed under the MIT license found in the
* LICENSE file in the root directory of this source tree.
*
* @flow strict-local
* @format
*/
'use strict';
import type {TurboModule} from 'react-native/Libraries/TurboModule/RCTExport';
import {TurboModuleRegistry} from 'react-native';
export interface Spec extends TurboModule {
getString: (hashKey: string) => string;
}
export default (TurboModuleRegistry.get<Spec>('FbtModule'): ?Spec);
|
<?php
/*
* This file is part of Totara Learn
*
* Copyright (C) 2021 onwards Totara Learning Solutions LTD
*
* This program is free software; you can redistribute it and/or modify
* it under the terms of the GNU General Public License as published by
* the Free Software Foundation; either version 3 of the License, or
* (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with this program. If not, see <http://www.gnu.org/licenses/>.
*
* @author Oleg Demeshev <[email protected]>
* @package mod_facetoface
*/
use mod_facetoface\userdata\room_virtualmeeting;
use totara_userdata\userdata\item;
use totara_userdata\userdata\target_user;
defined('MOODLE_INTERNAL') || die();
require_once(__DIR__ . '/virtualmeeting_testcase.php');
class mod_facetoface_userdata_room_virtualmeeting_testcase extends mod_facetoface_virtualmeeting_testcase {
/** @var target_user */
private $targetuser1;
/** @var target_user */
private $targetuser2;
public function setUp(): void {
parent::setUp();
$this->targetuser1 = new target_user($this->user1->to_record());
$this->targetuser2 = new target_user($this->user2->to_record());
}
public function tearDown(): void {
$this->targetuser1 = null;
$this->targetuser2 = null;
parent::tearDown();
}
/**
* Test count.
*/
public function test_count() {
// System context
$this->assertEquals(2, room_virtualmeeting::execute_count($this->targetuser1, context_system::instance()));
$this->assertEquals(0, room_virtualmeeting::execute_count($this->targetuser2, context_system::instance()));
// Module context
$coursemodule = $this->event1->get_seminar()->get_coursemodule();
$modulecontext = context_module::instance($coursemodule->id);
$this->assertEquals(2, room_virtualmeeting::execute_count($this->targetuser1, $modulecontext));
$this->assertEquals(0, room_virtualmeeting::execute_count($this->targetuser2, $modulecontext));
// Course context
$coursemodule = $this->event1->get_seminar()->get_coursemodule();
$coursecontext = context_course::instance($coursemodule->course);
$this->assertEquals(2, room_virtualmeeting::execute_count($this->targetuser1, $coursecontext));
$this->assertEquals(0, room_virtualmeeting::execute_count($this->targetuser2, $coursecontext));
// Course category context
$coursemodule = $this->event1->get_seminar()->get_coursemodule();
$course = get_course($coursemodule->course);
$coursecatcontext = context_coursecat::instance($course->category);
$this->assertEquals(2, room_virtualmeeting::execute_count($this->targetuser1, $coursecatcontext));
$this->assertEquals(0, room_virtualmeeting::execute_count($this->targetuser2, $coursecatcontext));
}
/**
* Test export.
*/
public function test_export() {
        // System context.
$export = room_virtualmeeting::execute_export($this->targetuser1, context_system::instance());
$data = $export->data;
$this->assertCount(2, $data);
$record = array_shift($data);
$this->assertEquals($this->targetuser1->id, $record->userid);
$this->assertEquals('vroom1', $record->name);
$this->assertEquals('poc_app', $record->plugin);
$this->assertNotEmpty($record->description);
// Module context
$coursemodule = $this->event1->get_seminar()->get_coursemodule();
$modulecontext = context_module::instance($coursemodule->id);
$export = room_virtualmeeting::execute_export($this->targetuser1, $modulecontext);
$data = $export->data;
$this->assertCount(2, $data);
$record = array_shift($data);
$this->assertEquals($this->targetuser1->id, $record->userid);
$this->assertEquals('vroom1', $record->name);
$this->assertEquals('poc_app', $record->plugin);
$this->assertNotEmpty($record->description);
// Course context
$coursemodule = $this->event1->get_seminar()->get_coursemodule();
$coursecontext = context_course::instance($coursemodule->course);
$export = room_virtualmeeting::execute_export($this->targetuser1, $coursecontext);
$data = $export->data;
$this->assertCount(2, $data);
$record = array_shift($data);
$this->assertEquals($this->targetuser1->id, $record->userid);
$this->assertEquals('vroom1', $record->name);
$this->assertEquals('poc_app', $record->plugin);
$this->assertNotEmpty($record->description);
// Course category context
$coursemodule = $this->event1->get_seminar()->get_coursemodule();
$course = get_course($coursemodule->course);
$coursecatcontext = context_coursecat::instance($course->category);
$export = room_virtualmeeting::execute_export($this->targetuser1, $coursecatcontext);
$data = $export->data;
$this->assertCount(2, $data);
$record = array_shift($data);
$this->assertEquals($this->targetuser1->id, $record->userid);
$this->assertEquals('vroom1', $record->name);
$this->assertEquals('poc_app', $record->plugin);
$this->assertNotEmpty($record->description);
$export = room_virtualmeeting::execute_export($this->targetuser2, context_system::instance());
$data = $export->data;
$this->assertEmpty($data);
}
public function test_purge_context_system() {
global $DB;
$status = room_virtualmeeting::execute_purge($this->targetuser2, context_system::instance());
$this->assertEquals(item::RESULT_STATUS_SUCCESS, $status);
$this->assertEquals(2, $DB->count_records('facetoface_room_virtualmeeting', ['userid' => $this->user1->id]));
$this->assertEquals(0, $DB->count_records('facetoface_room_virtualmeeting', ['userid' => $this->user2->id]));
$status = room_virtualmeeting::execute_purge($this->targetuser1, context_system::instance());
$this->assertEquals(item::RESULT_STATUS_SUCCESS, $status);
$this->assertEquals(0, $DB->count_records('facetoface_room_virtualmeeting', ['userid' => $this->user1->id]));
$this->assertEquals(0, $DB->count_records('facetoface_room_virtualmeeting', ['userid' => $this->user2->id]));
}
public function test_purge_context_module() {
global $DB;
$coursemodule = $this->event1->get_seminar()->get_coursemodule();
$modulecontext = context_module::instance($coursemodule->id);
$status = room_virtualmeeting::execute_purge($this->targetuser2, $modulecontext);
$this->assertEquals(item::RESULT_STATUS_SUCCESS, $status);
$this->assertEquals(2, $DB->count_records('facetoface_room_virtualmeeting', ['userid' => $this->user1->id]));
$this->assertEquals(0, $DB->count_records('facetoface_room_virtualmeeting', ['userid' => $this->user2->id]));
$status = room_virtualmeeting::execute_purge($this->targetuser1, $modulecontext);
$this->assertEquals(item::RESULT_STATUS_SUCCESS, $status);
$this->assertEquals(0, $DB->count_records('facetoface_room_virtualmeeting', ['userid' => $this->user1->id]));
$this->assertEquals(0, $DB->count_records('facetoface_room_virtualmeeting', ['userid' => $this->user2->id]));
}
public function test_purge_context_course() {
global $DB;
$coursemodule = $this->event1->get_seminar()->get_coursemodule();
$coursecontext = context_course::instance($coursemodule->course);
$status = room_virtualmeeting::execute_purge($this->targetuser2, $coursecontext);
$this->assertEquals(item::RESULT_STATUS_SUCCESS, $status);
$this->assertEquals(2, $DB->count_records('facetoface_room_virtualmeeting', ['userid' => $this->user1->id]));
$this->assertEquals(0, $DB->count_records('facetoface_room_virtualmeeting', ['userid' => $this->user2->id]));
$status = room_virtualmeeting::execute_purge($this->targetuser1, $coursecontext);
$this->assertEquals(item::RESULT_STATUS_SUCCESS, $status);
$this->assertEquals(0, $DB->count_records('facetoface_room_virtualmeeting', ['userid' => $this->user1->id]));
$this->assertEquals(0, $DB->count_records('facetoface_room_virtualmeeting', ['userid' => $this->user2->id]));
}
public function test_purge_context_course_category() {
global $DB;
$coursemodule = $this->event1->get_seminar()->get_coursemodule();
$course = get_course($coursemodule->course);
$coursecatcontext = context_coursecat::instance($course->category);
$status = room_virtualmeeting::execute_purge($this->targetuser2, $coursecatcontext);
$this->assertEquals(item::RESULT_STATUS_SUCCESS, $status);
$this->assertEquals(2, $DB->count_records('facetoface_room_virtualmeeting', ['userid' => $this->user1->id]));
$this->assertEquals(0, $DB->count_records('facetoface_room_virtualmeeting', ['userid' => $this->user2->id]));
$status = room_virtualmeeting::execute_purge($this->targetuser1, $coursecatcontext);
$this->assertEquals(item::RESULT_STATUS_SUCCESS, $status);
$this->assertEquals(0, $DB->count_records('facetoface_room_virtualmeeting', ['userid' => $this->user1->id]));
$this->assertEquals(0, $DB->count_records('facetoface_room_virtualmeeting', ['userid' => $this->user2->id]));
}
}
|
!c crossmul - cross multiply two files, one conjugated, form int and amp file
subroutine crossmul(cst, slcAccessor1, slcAccessor2, ifgAccessor, ampAccessor) BIND(C,name='crossmul_f')
use, intrinsic :: iso_c_binding
use crossmulState
implicit none
include 'omp_lib.h'
type(crossmulType):: cst
integer (C_INT64_T) slcAccessor1
integer (C_INT64_T) slcAccessor2
integer (C_INT64_T) ifgAccessor
integer (C_INT64_T) ampAccessor
complex*8, allocatable:: in1(:,:),in2(:,:)
complex*8, allocatable:: igram(:,:,:),amp(:,:,:)
complex*8, allocatable:: up1(:,:,:),up2(:,:,:)
complex*8, allocatable:: inline1(:,:),inline2(:,:)
complex*8, allocatable:: igramacc(:,:),ampacc(:,:)
complex*8, allocatable:: igramtemp(:,:),amptemp(:,:)
integer n, i, j, k, nnn, line
integer nblocks, iblk, nl, ith
!!!!!!For now, making local copies
!!!!!!Could access anywhere in code using cst%
integer :: na, nd, looksac, looksdn, blocksize
double precision:: scale
na = cst%na
nd = cst%nd
looksac = cst%looksac
looksdn = cst%looksdn
blocksize = cst%blocksize
scale = cst%scale
!$omp parallel
n=omp_get_num_threads()
!$omp end parallel
print *, 'Max threads used: ', n
!c get ffts lengths for upsampling
do i=1,16
nnn=2**i
if(nnn.ge.na)go to 11
end do
11 print *,'FFT length: ',nnn
call cfft1d_jpl(nnn, igramacc, 0) !c Initialize FFT plan
call cfft1d_jpl(2*nnn, igramacc, 0)
!c Number of blocks needed
nblocks = CEILING(nd/(1.0*blocksize*looksdn))
print *, 'Overall:', nd, blocksize*looksdn, nblocks
allocate(in1(na,looksdn*blocksize), in2(na,looksdn*blocksize))
allocate(igramtemp(na/looksac,blocksize), amptemp(na/looksac,blocksize))
!c allocate the local arrays
allocate (igram(na*2,looksdn,n),amp(na*2,looksdn,n))
allocate (igramacc(na,n),ampacc(na,n))
allocate (up1(nnn*2,looksdn,n),up2(nnn*2,looksdn,n),inline1(nnn,n),inline2(nnn,n))
do iblk=1, nblocks
k = (iblk-1)*blocksize*looksdn+1
in1 = cmplx(0., 0.)
in2 = cmplx(0., 0.)
igramtemp = cmplx(0., 0.)
amptemp = cmplx(0., 0.)
if (iblk.ne.nblocks) then
nl = looksdn*blocksize
else
nl = (nd - (nblocks-1)*blocksize*looksdn)
endif
!c print *, 'Block: ', iblk, k, nl
do j=1, nl
call getLineSequential(slcAccessor1,in1(:,j),k)
end do
if (slcAccessor1.ne.slcAccessor2) then
do j=1, nl
call getLineSequential(slcAccessor2,in2(:,j),k)
end do
else
in2 = in1
endif
in1 = in1*scale
in2 = in2*scale
!$omp parallel do private(j,k,i,line,ith) &
!$omp shared(in1,in2,igramtemp,amptemp,nl) &
!$omp shared(looksdn,looksac,scale,na,nnn, nd)&
!$omp shared(up1,up2,inline1,inline2,igram,amp)&
!$omp shared(igramacc,ampacc,n)
do line=1,nl/looksdn
! get thread number
ith = omp_get_thread_num() + 1
up1(:,:,ith)=cmplx(0.,0.) ! upsample file 1
do i=1,looksdn
inline1(1:na,ith)=in1(:,i+(line-1)*looksdn)
inline1(na+1:nnn, ith)=cmplx(0.,0.)
call cfft1d_jpl(nnn, inline1(1,ith), -1)
up1(1:nnn/2,i,ith)=inline1(1:nnn/2,ith)
up1(2*nnn-nnn/2+1:2*nnn,i,ith)=inline1(nnn/2+1:nnn,ith)
call cfft1d_jpl(2*nnn, up1(1,i,ith), 1)
end do
up1(:,:,ith)=up1(:,:,ith)/nnn
up2(:,:,ith)=cmplx(0.,0.) ! upsample file 2
do i=1,looksdn
inline2(1:na,ith)=in2(:,i+(line-1)*looksdn)
inline2(na+1:nnn,ith)=cmplx(0.,0.)
call cfft1d_jpl(nnn, inline2(1,ith), -1)
up2(1:nnn/2,i,ith)=inline2(1:nnn/2,ith)
up2(2*nnn-nnn/2+1:2*nnn,i,ith)=inline2(nnn/2+1:nnn,ith)
call cfft1d_jpl(2*nnn, up2(1,i,ith), 1)
end do
up2(:,:,ith)=up2(:,:,ith)/nnn
igram(1:na*2,:,ith)=up1(1:na*2,:,ith)*conjg(up2(1:na*2,:,ith))
amp(1:na*2,:,ith)=cmplx(cabs(up1(1:na*2,:,ith))**2,cabs(up2(1:na*2,:,ith))**2)
!c reclaim the extra two across looks first
do j=1,na
igram(j,:,ith) = igram(j*2-1,:,ith)+igram(j*2,:,ith)
amp(j,:,ith) = amp(j*2-1,:,ith)+amp(j*2,:,ith)
end do
!c looks down
igramacc(:,ith)=sum(igram(1:na,:,ith),2)
ampacc(:, ith)=sum(amp(1:na,:,ith),2)
!c looks across
do j=0,na/looksac-1
do k=1,looksac
igramtemp(j+1,line)=igramtemp(j+1,line)+igramacc(j*looksac+k,ith)
amptemp(j+1, line)=amptemp(j+1,line)+ampacc(j*looksac+k,ith)
end do
amptemp(j+1, line)=cmplx(sqrt(real(amptemp(j+1, line))),sqrt(aimag(amptemp(j+1, line))))
end do
end do
!$omp end parallel do
do line=1, nl/looksdn
call setLineSequential(ifgAccessor,igramtemp(1,line))
call setLineSequential(ampAccessor,amptemp(1,line))
end do
enddo
!c Uninitialize the FFT plans before releasing the work arrays
call cfft1d_jpl(nnn, igramacc, 2)
call cfft1d_jpl(2*nnn, igramacc, 2)
deallocate (up1,up2,igramacc,ampacc,inline1,inline2,igram,amp)
deallocate(in1, in2, igramtemp, amptemp)
end
|
use crate::shared::config::AppConfig;
use crate::shared::PlainContext;
use crate::shared::{argon2_verify, sha512};
use ara_error::{ApiError, BoxedError};
use ara_model::core::{User, UserCredential};
use ara_model::db::{tx, Connection};
use chrono::Utc;
use failure::Fail;
use serde::Serialize;
pub fn login(
context: &dyn PlainContext,
username: &str,
password: &str,
) -> Result<User, LoginError> {
tx(context.db(), |conn| {
let config = AppConfig::get();
login_internal(
conn,
username,
password,
config.security.secret_key.as_bytes(),
)
})
}
fn login_internal(
conn: &Connection,
username: &str,
password: &str,
secret_key: &[u8],
) -> Result<User, LoginError> {
let user = User::find_by_username(conn, username)?
.ok_or_else(|| LoginErrorKind::InvalidUsernameOrPassword)?;
let user_credential = UserCredential::find_by_id(conn, user.id)?
.ok_or_else(|| LoginErrorKind::InvalidUsernameOrPassword)?;
let hash = user_credential
.password_hash
.as_ref()
.ok_or_else(|| LoginErrorKind::InvalidUsernameOrPassword)?
.as_str();
let password_sha512 = sha512(password.as_ref());
let valid = argon2_verify(&password_sha512, secret_key, &hash)?;
if !valid {
User::increment_failed_login_count(conn, user.id)?;
Err(LoginErrorKind::InvalidUsernameOrPassword)?;
} else if !user.active {
Err(LoginErrorKind::AccountNotActivated)?;
} else if user_credential
.expires_at
.map(|exp| exp <= Utc::now())
.unwrap_or(false)
{
Err(LoginErrorKind::PasswordExpired)?;
} else if user_credential.invalid_attempts > 0 {
User::reset_failed_login_count(conn, user.id)?;
}
Ok(User::from(user))
}
#[derive(Debug, Serialize, Fail, ApiError)]
pub enum LoginErrorKind {
#[fail(display = "Invalid username or password")]
#[api_error(http(401))]
InvalidUsernameOrPassword,
#[fail(display = "Account is currently locked")]
#[api_error(http(401))]
AccountLocked,
#[fail(display = "Account is not activated")]
#[api_error(http(401))]
AccountNotActivated,
#[fail(display = "Password expired")]
#[api_error(http(401))]
PasswordExpired,
#[fail(display = "Internal error")]
#[api_error(map_from(Error), http(500))]
Internal(BoxedError),
}
|
#!/bin/bash -ue
set -o errexit
set -o pipefail
set -o nounset
# Single Linux build job whose sole task is to verify code formatting
if [[ -n ${CLANGFORMAT+x} ]] && [[ "$CLANGFORMAT" == "ON" ]]; then
./utility/format.sh
dirty=$(git ls-files --modified)
if [[ $dirty ]]; then
echo "Files with unexpected source code formatting:"
echo "$dirty"
exit 1
else
echo "All files verified for expected source code formatting"
exit 0
fi
fi
if [[ -z ${BUILD_SHARED_LIBS+v} ]]; then
export BUILD_SHARED_LIBS=OFF
fi
if [[ -z ${ENABLE_UNICODE+v} ]]; then
export ENABLE_UNICODE=OFF
fi
if [[ -z ${ENABLE_BOOST+v} ]]; then
export ENABLE_BOOST=OFF
fi
if [[ -z ${ENABLE_COVERAGE+v} ]] || [[ ! -z ${COVERITY+v} ]] || [[ "$TRAVIS_OS_NAME" == "osx" ]]; then
export ENABLE_COVERAGE=OFF
fi
if [[ -z ${DISABLE_LIBCXX+v} ]]; then
export DISABLE_LIBCXX=OFF
fi
if [[ -z ${DISABLE_EXAMPLES+v} ]]; then
export DISABLE_EXAMPLES=ON
fi
mkdir -p build
cd build
cmake -DCMAKE_BUILD_TYPE=Release \
-DBUILD_SHARED_LIBS=${BUILD_SHARED_LIBS} \
-DNANODBC_ENABLE_COVERAGE=${ENABLE_COVERAGE} \
-DNANODBC_ENABLE_UNICODE=${ENABLE_UNICODE} \
-DNANODBC_ENABLE_BOOST=${ENABLE_BOOST} \
-DNANODBC_DISABLE_LIBCXX=${DISABLE_LIBCXX} \
-DNANODBC_DISABLE_EXAMPLES=${DISABLE_EXAMPLES} \
..
make
cd test
make ${DB}_tests
ctest -VV --output-on-failure -R ${DB}_tests
|
// Package upgrade implements the node upgrade backend.
//
// After submitting an upgrade descriptor, the old node may continue
// running or be restarted up to the point when the consensus layer reaches
// the upgrade epoch. The new node may not be started until the old node has
// reached the upgrade epoch.
package upgrade
import (
"context"
"fmt"
"sync"
beacon "github.com/oasisprotocol/oasis-core/go/beacon/api"
"github.com/oasisprotocol/oasis-core/go/common/cbor"
"github.com/oasisprotocol/oasis-core/go/common/logging"
"github.com/oasisprotocol/oasis-core/go/common/persistent"
"github.com/oasisprotocol/oasis-core/go/upgrade/api"
"github.com/oasisprotocol/oasis-core/go/upgrade/migrations"
)
var (
_ api.Backend = (*upgradeManager)(nil)
metadataStoreKey = []byte("descriptors")
)
type upgradeManager struct {
sync.Mutex
store *persistent.ServiceStore
pending []*api.PendingUpgrade
dataDir string
logger *logging.Logger
}
// Implements api.Backend.
func (u *upgradeManager) SubmitDescriptor(ctx context.Context, descriptor *api.Descriptor) error {
if descriptor == nil {
return api.ErrBadDescriptor
}
u.Lock()
defer u.Unlock()
for _, pu := range u.pending {
if pu.Descriptor.Equals(descriptor) {
return api.ErrAlreadyPending
}
}
pending := &api.PendingUpgrade{
Versioned: cbor.NewVersioned(api.LatestPendingUpgradeVersion),
Descriptor: descriptor,
}
u.pending = append(u.pending, pending)
u.logger.Info("received upgrade descriptor, scheduling shutdown",
"handler", pending.Descriptor.Handler,
"epoch", pending.Descriptor.Epoch,
)
return u.flushDescriptorLocked()
}
// Implements api.Backend.
func (u *upgradeManager) PendingUpgrades(ctx context.Context) ([]*api.PendingUpgrade, error) {
u.Lock()
defer u.Unlock()
return append([]*api.PendingUpgrade{}, u.pending...), nil
}
// Implements api.Backend.
func (u *upgradeManager) HasPendingUpgradeAt(ctx context.Context, height int64) (bool, error) {
u.Lock()
defer u.Unlock()
if height == api.InvalidUpgradeHeight {
return false, fmt.Errorf("invalid upgrade height specified")
}
for _, pu := range u.pending {
if pu.IsCompleted() || pu.UpgradeHeight == api.InvalidUpgradeHeight || pu.UpgradeHeight != height {
continue
}
return true, nil
}
return false, nil
}
// Implements api.Backend.
func (u *upgradeManager) CancelUpgrade(ctx context.Context, descriptor *api.Descriptor) error {
if descriptor == nil {
return api.ErrBadDescriptor
}
u.Lock()
defer u.Unlock()
if len(u.pending) == 0 {
// Make sure nothing is saved.
return u.flushDescriptorLocked()
}
var pending []*api.PendingUpgrade
for _, pu := range u.pending {
if !pu.Descriptor.Equals(descriptor) {
pending = append(pending, pu)
continue
}
if pu.UpgradeHeight != api.InvalidUpgradeHeight || pu.HasAnyStages() {
return api.ErrUpgradeInProgress
}
}
oldPending := u.pending
u.pending = pending
if err := u.flushDescriptorLocked(); err != nil {
u.pending = oldPending
return err
}
return nil
}
// Implements api.Backend.
func (u *upgradeManager) GetUpgrade(ctx context.Context, descriptor *api.Descriptor) (*api.PendingUpgrade, error) {
if descriptor == nil {
return nil, api.ErrBadDescriptor
}
u.Lock()
defer u.Unlock()
for _, pu := range u.pending {
if pu.Descriptor.Equals(descriptor) {
return pu, nil
}
}
return nil, api.ErrUpgradeNotFound
}
func (u *upgradeManager) checkStatus() error {
u.Lock()
defer u.Unlock()
var err error
if err = u.store.GetCBOR(metadataStoreKey, &u.pending); err != nil {
u.pending = nil
if err == persistent.ErrNotFound {
// No upgrade pending, nothing to do.
u.logger.Debug("no pending descriptors, continuing startup")
return nil
}
return fmt.Errorf("can't decode stored upgrade descriptors: %w", err)
}
for _, pu := range u.pending {
if pu.IsCompleted() {
continue
}
// Check if upgrade should proceed.
if pu.UpgradeHeight == api.InvalidUpgradeHeight {
continue
}
// The upgrade should proceed right now. Check that we have the right binary.
if err = pu.Descriptor.EnsureCompatible(); err != nil {
u.logger.Error("incompatible binary version for upgrade",
"handler", pu.Descriptor.Handler,
"err", err,
logging.LogEvent, api.LogEventIncompatibleBinary,
)
return err
}
// Ensure the upgrade handler exists.
if _, err = migrations.GetHandler(pu.Descriptor.Handler); err != nil {
u.logger.Error("error getting migration handler for upgrade",
"handler", pu.Descriptor.Handler,
"err", err,
)
return err
}
}
if err = u.flushDescriptorLocked(); err != nil {
return err
}
u.logger.Info("loaded pending upgrade metadata",
"pending", u.pending,
)
return nil
}
// NOTE: Assumes lock is held.
func (u *upgradeManager) flushDescriptorLocked() error {
// Delete the state if there's no pending upgrades.
if len(u.pending) == 0 {
if err := u.store.Delete(metadataStoreKey); err != persistent.ErrNotFound {
return err
}
return nil
}
// Otherwise go over pending upgrades and check if any are completed.
var pending []*api.PendingUpgrade
for _, pu := range u.pending {
if pu.IsCompleted() {
u.logger.Info("upgrade completed, removing state",
"handler", pu.Descriptor.Handler,
)
continue
}
pending = append(pending, pu)
}
u.pending = pending
return u.store.PutCBOR(metadataStoreKey, u.pending)
}
// Implements api.Backend.
func (u *upgradeManager) StartupUpgrade() error {
u.Lock()
defer u.Unlock()
for _, pu := range u.pending {
if pu.UpgradeHeight == api.InvalidUpgradeHeight {
continue
}
if pu.HasStage(api.UpgradeStageStartup) {
u.logger.Warn("startup upgrade already performed, skipping",
"handler", pu.Descriptor.Handler,
)
continue
}
// Execute the startup stage.
u.logger.Warn("performing startup upgrade",
"handler", pu.Descriptor.Handler,
logging.LogEvent, api.LogEventStartupUpgrade,
)
migrationCtx := migrations.NewContext(pu, u.dataDir)
handler, err := migrations.GetHandler(pu.Descriptor.Handler)
if err != nil {
return err
}
if err := handler.StartupUpgrade(migrationCtx); err != nil {
return err
}
pu.PushStage(api.UpgradeStageStartup)
}
return u.flushDescriptorLocked()
}
// Implements api.Backend.
func (u *upgradeManager) ConsensusUpgrade(privateCtx interface{}, currentEpoch beacon.EpochTime, currentHeight int64) error {
u.Lock()
defer u.Unlock()
for _, pu := range u.pending {
// If we haven't reached the upgrade epoch yet, we run normally;
// startup made sure we're an appropriate binary for that.
if pu.UpgradeHeight == api.InvalidUpgradeHeight {
if currentEpoch < pu.Descriptor.Epoch {
return nil
}
pu.UpgradeHeight = currentHeight
if err := u.flushDescriptorLocked(); err != nil {
return err
}
return api.ErrStopForUpgrade
}
// If we're already past the upgrade height, then everything must be complete.
if pu.UpgradeHeight < currentHeight {
pu.PushStage(api.UpgradeStageConsensus)
continue
}
if pu.UpgradeHeight > currentHeight {
panic("consensus upgrade: UpgradeHeight is in the future but upgrade epoch seen already")
}
if !pu.HasStage(api.UpgradeStageConsensus) {
u.logger.Warn("performing consensus upgrade",
"handler", pu.Descriptor.Handler,
logging.LogEvent, api.LogEventConsensusUpgrade,
)
migrationCtx := migrations.NewContext(pu, u.dataDir)
handler, err := migrations.GetHandler(pu.Descriptor.Handler)
if err != nil {
return err
}
if err := handler.ConsensusUpgrade(migrationCtx, privateCtx); err != nil {
return err
}
}
}
return u.flushDescriptorLocked()
}
// Implements api.Backend.
func (u *upgradeManager) Close() {
u.Lock()
defer u.Unlock()
_ = u.flushDescriptorLocked()
u.store.Close()
}
// New constructs and returns a new upgrade manager. It also checks for and loads any
// pending upgrade descriptors; if this node is not the one intended to be run according
// to the loaded descriptor, New will return an error.
func New(store *persistent.CommonStore, dataDir string) (api.Backend, error) {
svcStore, err := store.GetServiceStore(api.ModuleName)
if err != nil {
return nil, err
}
upgrader := &upgradeManager{
store: svcStore,
dataDir: dataDir,
logger: logging.GetLogger(api.ModuleName),
}
if err := upgrader.checkStatus(); err != nil {
return nil, err
}
return upgrader, nil
}
|
import { ChangeDetectionStrategy, Component, Input } from '@angular/core';
import { Country } from '@api/custom/country';
import { NetworkType } from '@api/custom/network-type';
import { Subset } from '@api/custom/subset';
import { Stat } from '../domain/stat';
@Component({
selector: 'kpn-overview-list-stat-row',
changeDetection: ChangeDetectionStrategy.OnPush,
template: `
<tr>
<td *ngIf="rowspan" [rowSpan]="rowspan">
<kpn-country-name [country]="country"></kpn-country-name>
</td>
<td>
<kpn-network-type-icon
[networkType]="networkType"
></kpn-network-type-icon>
</td>
<td class="value">
<kpn-overview-value
[stat]="stat"
[subset]="subset(country, networkType)"
></kpn-overview-value>
</td>
</tr>
`,
styles: [
`
:host {
display: contents;
}
.value {
text-align: right;
vertical-align: middle;
width: 3.5em;
}
`,
],
})
export class OverviewListStatRowComponent {
@Input() rowspan: number = null;
@Input() country: Country;
@Input() networkType: NetworkType;
@Input() stat: Stat;
subset(country: Country, networkType: NetworkType): Subset {
return { country, networkType };
}
}
|
import Data.Bits
import Data.List
import Data.Map (Map)
import qualified Data.Map as Map
import Data.Tuple
type Registers = [Int]
type Op = Registers -> Int -> Int -> Int -> Registers
type Instruction = (Int, Int, Int, Int)
get :: Int -> Registers -> Int
get 0 [value, _, _, _] = value
get 1 [_, value, _, _] = value
get 2 [_, _, value, _] = value
get 3 [_, _, _, value] = value
set :: Int -> Registers -> Int -> Registers
set 0 [_, _1, _2, _3] value = [value, _1, _2, _3]
set 1 [_0, _, _2, _3] value = [_0, value, _2, _3]
set 2 [_0, _1, _, _3] value = [_0, _1, value, _3]
set 3 [_0, _1, _2, _] value = [_0, _1, _2, value]
op :: (Int -> Registers -> Int) -> (Int -> Registers -> Int) -> (Int -> Int -> Int) -> Op
op getA getB fun registers a b c = set c registers (fun (getA a registers) (getB b registers))
addr :: Op
addr = op get get (+)
addi :: Op
addi = op get (const . id) (+)
mulr :: Op
mulr = op get get (*)
muli :: Op
muli = op get (const . id) (*)
banr :: Op
banr = op get get (.&.)
bani :: Op
bani = op get (const . id) (.&.)
borr :: Op
borr = op get get (.|.)
bori :: Op
bori = op get (const . id) (.|.)
setr :: Op
setr = op get get const
seti :: Op
seti = op (const . id) get const
gtir :: Op
gtir = op (const . id) get (\x y -> fromEnum $ x > y)
gtri :: Op
gtri = op get (const . id) (\x y -> fromEnum $ x > y)
gtrr :: Op
gtrr = op get get (\x y -> fromEnum $ x > y)
eqir :: Op
eqir = op (const . id) get (\x y -> fromEnum $ x == y)
eqri :: Op
eqri = op get (const . id) (\x y -> fromEnum $ x == y)
eqrr :: Op
eqrr = op get get (\x y -> fromEnum $ x == y)
opcodes :: [Op]
opcodes = [addr, addi, mulr, muli, banr, bani, borr, bori, setr, seti, gtir, gtri, gtrr, eqir, eqri, eqrr]
execute :: [Op] -> Registers -> Instruction -> Registers
execute ops registers (op, a, b, c) = (ops !! op) registers a b c
parseInstruction :: String -> Instruction
parseInstruction instruction =
let [op, a, b, c] = map read $ words $ instruction
in (op, a, b, c)
parse :: [String] -> ([(Registers, Instruction, Registers)], [Instruction])
parse ("" : "" : xs) = ([], map parseInstruction xs)
parse (('B':'e':'f':'o':'r':'e':':':' ':before) : instruction : ('A':'f':'t':'e':'r':':':' ':' ':after) : "" : xs) =
let (samples, program) = parse xs
in ((read before, parseInstruction instruction, read after) : samples, program)
ops :: [(Registers, Instruction, Registers)] -> Map Int Int -> [Op]
ops samples mapping =
if Map.size mapping == length opcodes
then map (\(_, i) -> opcodes !! i) $ sort $ map swap $ Map.toList mapping
else ops samples $ foldl matching mapping samples
where matching mapping (before, (op, a, b, c), after) =
case filter (\(i, op) -> op before a b c == after && i `Map.notMember` mapping) $ zip [0..] opcodes of
[(i, _)] -> Map.insert i op mapping
_ -> mapping
part1 :: String -> Int
part1 = length . filter (>= 3) . map matching . fst . parse . lines
where matching (before, (_, a, b, c), after) = length $ filter (\op -> op before a b c == after) opcodes
part2 :: String -> Int
part2 input =
let (samples, program) = parse $ lines input
in get 0 $ foldl (execute $ ops samples Map.empty) [0,0,0,0] program
|
package fr.openwide.core.wicket.more.markup.html.factory;
/**
* @deprecated Use {@link AbstractDetachableFactory AbstractDetachableFactory<T, Condition>} instead.
*/
@Deprecated
public abstract class AbstractOneParameterConditionFactory<T> implements IOneParameterConditionFactory<T> {
private static final long serialVersionUID = 712071199355219097L;
@Override
public void detach() {
// nothing to do
}
}
|
<?php
namespace App\Http\Controllers;
use App\Http\Requests\VoteRequest;
use Illuminate\Http\Request;
use App\Models;
use Illuminate\Support\Facades\Auth;
class HomeController extends Controller
{
/**
* Create a new controller instance.
*
* @return void
*/
public function __construct()
{
$this->middleware('auth');
}
/**
* Show the application dashboard.
*
* @return \Illuminate\Http\Response
*/
public function index()
{
$user = Models\Users::find(Auth::user()->id);
$refreshTime = getenv('VOTE_ACTIVE_TIME');
// Only count votes cast within the active window (VOTE_ACTIVE_TIME minutes).
$windowStart = date('U') - (60 * $refreshTime);
$votes = Models\PollVotes::where("update_time", ">", $windowStart)->get();
$result['a'] = 0;
$result['b'] = 0;
foreach ($votes as $line){
$result[$line->vote] ++;
}
return view('dashboard', compact('user', 'result'));
}
public function vote(VoteRequest $request)
{
$time = date('U');
$vote = $request->getVote();
return Models\PollVotes::updateOrCreate(
['id_user' => Auth::id()],
['vote' => $vote, 'update_time' => $time]
);
}
public function getVote()
{
$refreshTime = getenv('VOTE_ACTIVE_TIME');
// Only count votes cast within the active window (VOTE_ACTIVE_TIME minutes).
$windowStart = date('U') - (60 * $refreshTime);
$votes = Models\PollVotes::where("update_time", ">", $windowStart)->get();
$result['a'] = 0;
$result['b'] = 0;
foreach ($votes as $line) {
// The query already filters on update_time, so every returned row counts.
$result[$line->vote]++;
}
return json_encode([$result['a'], $result['b']]);
}
}
|
package cn.practice.code.priciple.task1;
import java.text.ParseException;
import java.util.Arrays;
import java.util.List;
/**
* @author pi
*/
public class CalendarWeekHtmlPrinter extends CalendarWeekPrinter{
private String returnHtmlFormatWeekNameString() {
StringBuffer stringBuffer = new StringBuffer();
stringBuffer.append("<thead> \n <tr> \n");
String[] weekName = returnWeekNameArray();
List<String> weekNameList = Arrays.asList(weekName);
weekNameList.forEach(week -> {
stringBuffer.append("<td>");
stringBuffer.append(week);
stringBuffer.append("</td> \n");
});
stringBuffer.append("</tr> \n </thead> \n");
return stringBuffer.toString();
}
private String returnHtmlFormatDateString(String inputDateStr) throws ParseException {
StringBuffer stringBuffer = new StringBuffer();
int[] weekCalendars = getWeekCalendars(inputDateStr);
stringBuffer.append("<tbody> \n <tr> \n");
for (int i = 0; i < weekCalendars.length; i++) {
stringBuffer.append("<td>");
stringBuffer.append(weekCalendars[i]);
stringBuffer.append("</td> \n");
}
stringBuffer.append("</tr> \n </tbody> \n");
return stringBuffer.toString();
}
@Override
public void print(String inputDateString) throws ParseException {
StringBuffer stringBuffer = new StringBuffer();
stringBuffer.append("<table> \n");
stringBuffer.append(returnHtmlFormatWeekNameString());
stringBuffer.append(returnHtmlFormatDateString(inputDateString));
stringBuffer.append("</table> \n");
System.out.println(stringBuffer.toString());
}
}
|
const knex = require('knex');
const knexConfig = require('../knexfile');
// Jest has trouble loading .env files, so NODE_ENV may be unset under test;
// falling back to 'development' keeps the test runs working.
const dbEngine = process.env.NODE_ENV || 'development';
module.exports = knex(knexConfig[dbEngine]);
|
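The knex setup above picks a database configuration out of the knexfile by `NODE_ENV`, falling back to `development`. The lookup can be sketched without knex itself; the config entries and the `selectConfig` helper below are illustrative assumptions, not the project's real knexfile:

```javascript
// Minimal sketch of per-environment config selection (no knex required).
// These config entries are hypothetical, for illustration only.
const knexfile = {
  development: { client: 'sqlite3', connection: { filename: './dev.sqlite3' } },
  test: { client: 'sqlite3', connection: { filename: ':memory:' } },
};

// Fall back to 'development' when the environment name is unset,
// mirroring `process.env.NODE_ENV || 'development'` above.
function selectConfig(env) {
  return knexfile[env] || knexfile.development;
}

console.log(selectConfig('test').connection.filename); // ':memory:'
console.log(selectConfig(undefined).client); // 'sqlite3'
```

The `||` fallback treats any falsy environment name (unset, empty string) as `development`, which is exactly why the module works under test runners that do not load `.env` files.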