message | diff
---|---|
Import kExternalLink into content-unavailable-page
Fixes | <script>
+ import kExternalLink from 'kolibri.coreVue.components.kExternalLink';
+
export default {
name: 'learnContentUnavailable',
+ components: {
+ kExternalLink,
+ },
$trs: {
header: 'No content channels available',
adminLink: 'You can import content from the Content page if you have the proper permissions',
|
Changed "man" to "woman" in ct-waterbury-1
Also added tag to ct-waterbury-3 | @@ -28,7 +28,7 @@ id: ct-waterbury-2
A group of people are protesting outside Waterbury police headquarters around 5:30PM. Police are lined up blocking the roadway; they warn protesters to clear the roadway. Police declare the assembly unlawful, apparently in response to the protesters standing in the roadway. Police again warn protesters to clear the roadway, and say that anyone in the roadway will be arrested. A woman steps out into the street, walking slowly across holding a sign. Police tell her to get out of the roadway; she continues across to the other side. Police move forward and arrest numerous protesters, several of whom do not appear to have been in the roadway to begin with. Several protesters are forced to the ground, including one who is slammed against a police car (see second link). Later in the same incident (around 6:30 in the first link), police aggressively and without apparent reason arrest another protester. An LRAD is mentioned, but apparently not deployed. Waterbury authorities say no property was damaged during the protest. Arrested protesters were not charged. (Note: this is a different incident from ct-waterbury-2 -- see street view.)
-tags: arrest, tackle
+tags: abuse-of-power, arrest, tackle
id: ct-waterbury-3
@@ -44,7 +44,7 @@ id: ct-waterbury-3
* [Protesters will not be charged](https://www.courant.com/breaking-news/hc-br-waterbury-protester-charges-dropped-20200618-33u442hdwfbu7pnb7zbdsdivaa-story.html)
-### Officer forces man into car and wrestles with another | June 3rd
+### Officer forces woman into car and wrestles with another | June 3rd
Officer forces woman into cop car and wrestles with another woman on her phone.
|
codegen: mark callee kernels as static
They don't need to be visible outside of the single compilation unit,
which will help the C compiler a bit. | @@ -579,9 +579,13 @@ class CASTBuilder(ASTBuilderBase):
if self.target.fortran_abi:
name += "_"
+ if codegen_state.kernel.is_called_from_host:
+ name = Value("void", name)
+ else:
+ name = Value("static void", name)
return FunctionDeclarationWrapper(
FunctionDeclaration(
- Value("void", name),
+ name,
[self.idi_to_cgen_declarator(codegen_state.kernel, idi)
for idi in codegen_state.implemented_data_info]))
|
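For illustration, a minimal sketch of how the cgen pieces in this diff compose (assuming cgen's `Value` and `FunctionDeclaration`, which the diff already uses; the kernel name and empty argument list are made up):

```python
from cgen import FunctionDeclaration, Value

# Callee kernels get "static void" so the symbol stays internal to the
# compilation unit; host-callable kernels keep plain "void".
for is_called_from_host in (True, False):
    return_type = "void" if is_called_from_host else "static void"
    decl = FunctionDeclaration(Value(return_type, "loopy_kernel"), [])
    print(decl)  # e.g. "static void loopy_kernel()"
```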
Increase timeout for Cloud Hypervisor tests
Increase timeout for Cloud Hypervisor tests since a lot of runs are
failing due to timeout errors. | @@ -22,7 +22,7 @@ class CloudHypervisorTestResult:
class CloudHypervisorTests(Tool):
- TIME_OUT = 3600
+ TIME_OUT = 7200
repo = "https://github.com/cloud-hypervisor/cloud-hypervisor.git"
|
Update README.md
add second react boilerplate (using create-react-app) | @@ -6,7 +6,9 @@ Below there are a couple of non-trivial examples that demonstrate an application
## React Boilerplate
-[React boilerplate](https://github.com/r0x0r/pywebview-react-boilerplate). A complete React-based boilerplate with installation, usage and building taken care of out of the box.
+[React boilerplate with parcel-bundler](https://github.com/r0x0r/pywebview-react-boilerplate). A complete React-based boilerplate with installation, usage and building taken care of out of the box.
+
+[React boilerplate with create-react-app](https://github.com/dzc0d3r/pywebview-react-boilerplate/). A complete React-based boilerplate with installation, usage and building taken care of out of the box.
## Serverless application
|
Removed special handling of emacs; it's not needed and wasn't working.
This was code copied from an ancient project of mine from a few years ago. | from __future__ import print_function
import cmd
import textwrap
-import os
import sys
import re
import pdb
@@ -27,12 +26,8 @@ class DebuggerCli(cmd.Cmd, object):
self.listener = listener
self.builtin = BuiltIn()
- if os.getenv("EMACS", False):
- # Running a shell inside of emacs sets this environment
- # variable. Handy, because when running in a shell inside emacs
- # we need to turn off raw input
- self.use_rawinput = False
+ # some simple aliases.
self.do_c = self.do_continue
self.do_l = self.do_locate_elements
self.do_s = self.do_step
|
appimage: update package (libc) in dockerfile
Ubuntu no longer serves old version | @@ -49,9 +49,9 @@ RUN apt-get update -q && \
libxcb-util1=0.4.0-0ubuntu3 \
libxcb-render-util0=0.3.9-1 \
libx11-xcb1=2:1.6.4-3ubuntu0.4 \
- libc6-dev=2.27-3ubuntu1.4 \
- libc6=2.27-3ubuntu1.4 \
- libc-dev-bin=2.27-3ubuntu1.4 \
+ libc6-dev=2.27-3ubuntu1.5 \
+ libc6=2.27-3ubuntu1.5 \
+ libc-dev-bin=2.27-3ubuntu1.5 \
&& \
rm -rf /var/lib/apt/lists/* && \
apt-get autoremove -y && \
|
Removed tear-gas tag
tear-gas was not **confirmed** so we will remove the tag to avoid undue speculation. | @@ -6,7 +6,7 @@ On June 27th, citizens of Aurora held a violin vigil in honor of Elijah McClain.
Aurora police arrived in riot gear to disperse the vigil. When participants refused, riot police beat them with batons and pepper-sprayed them.
-tags: strike, beat, baton, pepper-spray, spray, tear-gas
+tags: strike, beat, baton, pepper-spray, spray
id: co-aurora-1
|
Added ipi_deployment_required decorator to test
test_simultaneous_drain_of_two_ocs_nodes | @@ -16,7 +16,8 @@ from ocs_ci.ocs.node import (
)
from ocs_ci.framework.testlib import (
tier1, tier2, tier3, tier4, tier4b,
- ManageTest, aws_platform_required, ignore_leftovers
+ ManageTest, aws_platform_required, ignore_leftovers,
+ ipi_deployment_required
)
from tests.sanity_helpers import Sanity
@@ -252,6 +253,7 @@ class TestNodesMaintenance(ManageTest):
@tier4
@tier4b
@aws_platform_required
+ @ipi_deployment_required
@pytest.mark.parametrize(
argnames=["interface"],
argvalues=[
|
Changes to ECO dataset_converter
from_tuples now receives a list
Closes | @@ -120,9 +120,9 @@ def convert_eco(dataset_loc, hdf_filename, timezone):
13+phase:('phase_angle', ''),
'Q': ('power', 'reactive'),
}
- df_phase.columns = pd.MultiIndex.from_tuples(
+ df_phase.columns = pd.MultiIndex.from_tuples([
sm_column_name[col] for col in df_phase.columns
- )
+ ])
power_active = df_phase['power', 'active']
tmp_before = np.size(power_active)
|
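A runnable sketch of the fix (assuming a pandas version on which `MultiIndex.from_tuples` rejects a bare generator expression, which is what the diff works around; the column mapping is abbreviated):

```python
import pandas as pd

sm_column_name = {"A": ("power", "active"), "Q": ("power", "reactive")}

df_phase = pd.DataFrame([[1.0, 2.0]], columns=["A", "Q"])
# Wrapping the generator expression in a list, as the diff does, is
# accepted by from_tuples on all pandas versions.
df_phase.columns = pd.MultiIndex.from_tuples(
    [sm_column_name[col] for col in df_phase.columns]
)
print(df_phase["power", "active"])
```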
Remove chief-james.
The Chief deployment system is not used anymore. | @@ -6,8 +6,6 @@ aspy.yaml==0.2.1 \
cached-property==1.2.0 \
--hash=sha256:e3081a8182d3d4b7283eeade76c382bcfd4dfd644ca800598229c2ef798abb53 \
--hash=sha256:c3a5e7f41fe99991ab1f8df294f3d8180a56b46fedec94ebd54def89d0ec09c4
-chief-james==1.0.0 \
- --hash=sha256:c9f4bb8280c0c65f20a9dfc841157fcb479461ecae6972a8a33363e7693aa917
django-debug-toolbar==1.4.0 \
--hash=sha256:852a37b80df9597048591ebc87d0ce85a4edceaef015dc5360ad89cc5960c27b
django-sslserver==0.16 \
|
Change SyncManager behaviour when used as a context decorator
Resolves | @@ -37,7 +37,7 @@ class BaseManager(ContextManager[BaseManager]):
def start(self, initializer: Optional[Callable[..., Any]] = ...,
initargs: Iterable[Any] = ...) -> None: ...
-class SyncManager(BaseManager):
+class SyncManager(BaseManager, ContextManager[SyncManager]):
def BoundedSemaphore(self, value: Any = ...) -> threading.BoundedSemaphore: ...
def Condition(self, lock: Any = ...) -> threading.Condition: ...
def Event(self) -> threading.Event: ...
|
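A minimal usage sketch of the behaviour the stub change captures: `SyncManager` is itself a context manager, so the value bound by `with` must be typed as `SyncManager` (standard-library API only):

```python
from multiprocessing.managers import SyncManager

if __name__ == "__main__":
    # __enter__ starts the manager process and returns the manager itself,
    # so SyncManager-only methods such as .list() type-check in the block.
    with SyncManager() as manager:
        shared = manager.list([1, 2, 3])
        shared.append(4)
        print(shared[:])  # [1, 2, 3, 4]
```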
Docs: Small guess-work tweaks for the docs deploy script
We should probably switch to `doctr` soon. | @@ -45,14 +45,14 @@ git config user.name "Travis CI"
git config user.email "$COMMIT_AUTHOR_EMAIL"
# If there are no changes to the compiled out (e.g. this is a README update) then just bail.
-if [ -z `git diff --exit-code` ]; then
+if git diff --quiet; then
echo "No changes to the output on this push; exiting."
exit 0
fi
# Commit the "changes", i.e. the new version.
# The delta will show diffs between new and old versions.
-git add .
+git add --ignore-removal .
git commit -m "Deploy to GitHub Pages: ${SHA}"
# Get the deploy key by using Travis's stored variables to decrypt deploy_key.enc
|
Do not check number of files under binpath
Runpath cleanup may fail so do not require it to succeed
for this test to pass. | @@ -133,12 +133,10 @@ def test_binary_copy():
app = CustomApp(path_cleanup=True, **params)
with app:
# Will terminate the binary.
- assert len(os.listdir(app.binpath)) == 1
assert app.extracts['value'] == 'started'
app = CustomApp(path_cleanup=False, **params)
with app:
- assert len(os.listdir(app.binpath)) == 2
assert app.extracts['value'] == 'started'
|
Rally: decrease SLA for avg list of ports and nets
In a normal gate run these are returning in 2 seconds each
on average. Let's reduce the SLA of these from 15 to 5 now
to help prevent future performance regressions in this area. | network: 119
sla:
max_avg_duration_per_atomic:
- neutron.list_networks: 15 # reduce as perf is fixed
+ neutron.list_networks: 5 # reduce as perf is fixed
failure_rate:
max: 0
ports_per_network: 50
runner:
type: "constant"
- times: 8
+ times: 4
concurrency: 4
context:
users:
router: -1
# ((ports per net + 1 dhcp) * times) + (concurrency-1)
# see bug/1623390 for concurrency explanation
- port: 811
+ port: 207
sla:
max_avg_duration_per_atomic:
- neutron.list_ports: 15 # reduce as perf is fixed
+ neutron.list_ports: 5 # reduce as perf is fixed
failure_rate:
max: 0
NeutronTrunks.create_and_list_trunk_subports:
-
args:
- subport_count: 250
+ subport_count: 125
runner:
type: "constant"
times: 1
|
Fixed issue in normalizations
not normalizing next_state passed to agent
thanks to Manuel | @@ -192,7 +192,8 @@ class Core(object):
self._episode_steps < self.mdp.info.horizon and not absorbing)
state = self._state
- self._state = self._preprocess(next_state.copy())
+ next_state = self._preprocess(next_state.copy())
+ self._state = next_state
return state, action, reward, next_state, absorbing, last
|
Added test for absolute momentum
New test makes sure that absolute_momentum() works when using units from the xarray DataArray `units` attribute. | @@ -11,6 +11,7 @@ from metpy.calc import (absolute_momentum, cross_section_components, normal_comp
tangential_component, unit_vectors_from_cross_section)
from metpy.calc.cross_sections import (distances_from_cross_section,
latitude_from_cross_section)
+from metpy.cbook import get_test_data
from metpy.interpolate import cross_section
from metpy.testing import assert_array_almost_equal, assert_xarray_allclose, needs_cartopy
from metpy.units import units
@@ -321,3 +322,23 @@ def test_absolute_momentum_given_xy(test_cross_xy):
coords=test_cross_xy['u_wind'].coords,
dims=test_cross_xy['u_wind'].dims)
assert_xarray_allclose(momentum, true_momentum)
+
+
+def test_absolute_momentum_xarray_units_attr():
+ """Test absolute momentum when `u` and `v` are DataArrays with a `units` attribute."""
+ data = xr.open_dataset(get_test_data('narr_example.nc', False))
+ data = data.metpy.parse_cf().squeeze()
+
+ start = (37.0, -105.0)
+ end = (35.5, -65.0)
+ cross = cross_section(data, start, end)
+
+ u = cross['u_wind'][0].sel(index=slice(0, 2))
+ v = cross['v_wind'][0].sel(index=slice(0, 2))
+
+ momentum = absolute_momentum(u, v)
+ true_momentum_values = np.array([137.46164031, 134.11450232, 133.85196023])
+ true_momentum = xr.DataArray(units.Quantity(true_momentum_values, 'm/s'),
+ coords=u.coords)
+
+ assert_xarray_allclose(momentum, true_momentum)
|
Integrate filters with knn queries in OpenDistroElasticsearchDocumentStore
* Integrate filters with knn queries in ODFE
Allows the use of filters coupled with knn similarity search on
OpenDistroElasticsearchDocumentStore instances. Fixes
* Satisfy mypy | @@ -1039,6 +1039,81 @@ class OpenSearchDocumentStore(ElasticsearchDocumentStore):
**kwargs)
+ def query_by_embedding(self,
+ query_emb: np.ndarray,
+ filters: Optional[Dict[str, List[str]]] = None,
+ top_k: int = 10,
+ index: Optional[str] = None,
+ return_embedding: Optional[bool] = None) -> List[Document]:
+ """
+ Find the document that is most similar to the provided `query_emb` by using a vector similarity metric.
+
+ :param query_emb: Embedding of the query (e.g. gathered from DPR)
+ :param filters: Optional filters to narrow down the search space.
+ Example: {"name": ["some", "more"], "category": ["only_one"]}
+ :param top_k: How many documents to return
+ :param index: Index name for storing the docs and metadata
+ :param return_embedding: To return document embedding
+ :return:
+ """
+ if index is None:
+ index = self.index
+
+ if return_embedding is None:
+ return_embedding = self.return_embedding
+
+ if not self.embedding_field:
+ raise RuntimeError("Please specify arg `embedding_field` in ElasticsearchDocumentStore()")
+ else:
+ # +1 in similarity to avoid negative numbers (for cosine sim)
+ body = {
+ "size": top_k,
+ "query": {
+ "bool": {
+ "must": [
+ self._get_vector_similarity_query(query_emb, top_k)
+ ]
+ }
+ }
+ }
+ if filters:
+ filter_clause = []
+ for key, values in filters.items():
+ if type(values) != list:
+ raise ValueError(
+ f'Wrong filter format for key "{key}": Please provide a list of allowed values for each key. '
+ 'Example: {"name": ["some", "more"], "category": ["only_one"]} ')
+ filter_clause.append(
+ {
+ "terms": {key: values}
+ }
+ )
+ body["query"]["bool"]["filter"] = filter_clause # type: ignore
+
+ excluded_meta_data: Optional[list] = None
+
+ if self.excluded_meta_data:
+ excluded_meta_data = deepcopy(self.excluded_meta_data)
+
+ if return_embedding is True and self.embedding_field in excluded_meta_data:
+ excluded_meta_data.remove(self.embedding_field)
+ elif return_embedding is False and self.embedding_field not in excluded_meta_data:
+ excluded_meta_data.append(self.embedding_field)
+ elif return_embedding is False:
+ excluded_meta_data = [self.embedding_field]
+
+ if excluded_meta_data:
+ body["_source"] = {"excludes": excluded_meta_data}
+
+ logger.debug(f"Retriever query: {body}")
+ result = self.client.search(index=index, body=body, request_timeout=300)["hits"]["hits"]
+
+ documents = [
+ self._convert_es_hit_to_document(hit, adapt_score_for_embedding=True, return_embedding=return_embedding)
+ for hit in result
+ ]
+ return documents
+
def _create_document_index(self, index_name: str):
"""
Create a new index for storing documents.
|
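A self-contained sketch of the query body this method assembles. The knn clause shape is an assumption (the real code delegates to `self._get_vector_similarity_query`, which is not shown in the diff); the filter handling mirrors the diff:

```python
import json

def build_query_body(query_emb, top_k, filters=None, embedding_field="embedding"):
    body = {
        "size": top_k,
        "query": {
            "bool": {
                # Assumed OpenDistro-style knn clause; stands in for
                # _get_vector_similarity_query(query_emb, top_k).
                "must": [{"knn": {embedding_field: {"vector": query_emb, "k": top_k}}}],
            }
        },
    }
    if filters:
        # Each filter key becomes a terms clause, narrowing the knn results.
        body["query"]["bool"]["filter"] = [
            {"terms": {key: values}} for key, values in filters.items()
        ]
    return body

print(json.dumps(build_query_body([0.1, 0.2], 10, {"category": ["only_one"]}), indent=2))
```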
minor: Convert `unicode_emoji_regex` to uppercase.
Following the convention, we use uppercase for
regex. Also, `unicode_emoji_regex` is given a
conventional name ending with `*_RE`: `UNICODE_EMOJI_RE`. | @@ -909,7 +909,7 @@ class InlineInterestingLinkProcessor(markdown.treeprocessors.Treeprocessor):
This works by using the URLs, user_mentions and media data from
the twitter API and searching for Unicode emojis in the text using
- `unicode_emoji_regex`.
+ `UNICODE_EMOJI_RE`.
The first step is finding the locations of the URLs, mentions, media and
emoji in the text. For each match we build a dictionary with type, the start
@@ -968,7 +968,7 @@ class InlineInterestingLinkProcessor(markdown.treeprocessors.Treeprocessor):
}
)
# Build dicts for emojis
- for match in re.finditer(unicode_emoji_regex, text, re.IGNORECASE):
+ for match in re.finditer(UNICODE_EMOJI_RE, text, re.IGNORECASE):
orig_syntax = match.group("syntax")
codepoint = unicode_emoji_to_codepoint(orig_syntax)
if codepoint in codepoint_to_name:
@@ -1417,7 +1417,7 @@ class Timestamp(markdown.inlinepatterns.Pattern):
# \u2b00-\u2bff - Miscellaneous Symbols and Arrows
# \u3000-\u303f - CJK Symbols and Punctuation
# \u3200-\u32ff - Enclosed CJK Letters and Months
-unicode_emoji_regex = (
+UNICODE_EMOJI_RE = (
"(?P<syntax>["
"\U0001F100-\U0001F64F"
"\U0001F680-\U0001F6FF"
@@ -2244,7 +2244,7 @@ class Markdown(markdown.Markdown):
reg.register(Emoji(EMOJI_REGEX, self), "emoji", 15)
reg.register(EmoticonTranslation(EMOTICON_RE, self), "translate_emoticons", 10)
# We get priority 5 from 'nl2br' extension
- reg.register(UnicodeEmoji(unicode_emoji_regex), "unicodeemoji", 0)
+ reg.register(UnicodeEmoji(UNICODE_EMOJI_RE), "unicodeemoji", 0)
return reg
def register_linkifiers(self, inlinePatterns: markdown.util.Registry) -> markdown.util.Registry:
|
add verify_dtypes - ensure unstructured annotation types are permissible
addressing | @@ -55,6 +55,25 @@ def show_proportions(adata, layers=["spliced", "unspliced", "ambigious"], use_ra
print("Abundance of " + str(layers_keys) + ": " + str(np.round(mean_abundances, 2)))
+def verify_dtypes(adata):
+ try:
+ _ = adata[:, 0]
+ except:
+ uns = adata.uns
+ adata.uns = {}
+ try:
+ _ = adata[:, 0]
+ logg.warn(
+ "Safely deleted unstructured annotations (adata.uns), \n"
+ "as these do not comply with permissible anndata datatypes."
+ )
+ except:
+ logg.warn(
+ "The data might be corrupted. Please verify all annotation datatypes."
+ )
+ adata.uns = uns
+
+
def cleanup(data, clean="layers", keep=None, copy=False):
"""Deletes attributes not needed.
@@ -74,6 +93,7 @@ def cleanup(data, clean="layers", keep=None, copy=False):
Returns or updates `adata` with selection of attributes kept.
"""
adata = data.copy() if copy else data
+ verify_dtypes(adata)
keep = list([keep] if isinstance(keep, str) else {} if keep is None else keep)
keep.extend(["spliced", "unspliced", "Ms", "Mu", "clusters", "neighbors"])
@@ -102,6 +122,7 @@ def get_size(adata, layer=None):
def set_initial_size(adata, layers={"spliced", "unspliced"}):
+ verify_dtypes(adata)
layers = [
layer
for layer in layers
|
Update marketplace-flow.md
Clarify to users that Ocean Market step doesn't affect the steps that follow. Make it easier to create & fill config.ini. Better subtitles in related sections. | @@ -58,6 +58,8 @@ npm start
Check out the Ocean Market webapp at http://localhost:8000.
+Ocean Market is a graphical interface to the backend smart contracts and Ocean services (Aquarius, Provider). The following steps will interface to the backend in a different fashion: using the command-line / console, and won't need Ocean Market. But it's good to understand there are multiple views.
+
### Install the library
In a new console that we'll call the _work_ console (as we'll use it later):
@@ -76,11 +78,13 @@ pip install wheel
pip install ocean-lib
```
-### Set up contracts
+### Create config file
-Create a file called `test3/config.ini` and fill it as follows.
+In the work console:
-```text
+```console
+#Create config.ini file and fill it with configuration info
+echo """
[eth-network]
network = http://127.0.0.1:8545
address.file = ~/.ocean/ocean-contracts/artifacts/address.json
@@ -91,8 +95,11 @@ provider.url = http://localhost:8030
provider.address = 0x00bd138abd70e2f00903268f3db08f2d25677c9e
downloads.path = consume-downloads
+""" > config.ini
```
+### Set envvars
+
In the work console:
```console
#set private keys of two accounts
@@ -108,6 +115,11 @@ python
## 2. Alice publishes data asset
+In the work console:
+```console
+python
+```
+
In the Python console:
```python
#create ocean instance
|
Update index.rst
Google+ is no longer a valid social media community link and should be removed. | @@ -173,7 +173,6 @@ Other community links
- `Salt Stack Inc. <http://www.saltstack.com>`_
- `Subreddit <http://www.reddit.com/r/saltstack>`_
-- `Google+ <https://plus.google.com/114449193225626631691/posts>`_
- `YouTube <http://www.youtube.com/user/SaltStack>`_
- `Facebook <https://www.facebook.com/SaltStack>`_
- `Twitter <https://twitter.com/SaltStackInc>`_
|
Remove signal that is never used and never emitted
(we use .valueChanged instead) | @@ -229,8 +229,6 @@ class HorizontalSliderWithBrowser(qt.QAbstractSlider):
:param QWidget parent: Optional parent widget
"""
- sigIndexChanged = qt.pyqtSignal(object)
-
def __init__(self, parent=None):
qt.QAbstractSlider.__init__(self, parent)
self.setOrientation(qt.Qt.Horizontal)
|
Do not install Cython explicitly
This is now taken care of in pyproject.toml | @@ -36,7 +36,6 @@ rm /opt/python/cp27*
PYBINS="/opt/python/*/bin"
HAS_CYTHON=0
for PYBIN in ${PYBINS}; do
- ${PYBIN}/pip install Cython
# ${PYBIN}/pip install -r /io/requirements.txt
${PYBIN}/pip wheel /io/ -w wheelhouse/
done
|
Avoid a TypeError when reps is or contains a ndarray
Something along the lines of `TypeError: multiply only accepts scalar or ndarray, but got a list.` | @@ -1300,7 +1300,7 @@ def tile(a, reps):
a = reshape(a, (1,) * (len(reps) - ndim(a)) + shape(a))
reps = (1,) * (ndim(a) - len(reps)) + tuple(reps)
for i, rep in enumerate(reps):
- a = concatenate([a] * rep, axis=i)
+ a = concatenate([a] * int(rep), axis=i)
return a
@_wraps(onp.concatenate)
|
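A small sketch of the failure mode and the fix, with plain NumPy standing in for `jax.numpy` (a `rep` arriving as a 0-d array is the illustrative trigger):

```python
import numpy as np

def tile_once(a, rep, axis=0):
    # [a] * rep needs a real Python int; casting with int(), as the diff
    # does, also accepts 0-d ndarrays and NumPy integer scalars.
    return np.concatenate([a] * int(rep), axis=axis)

a = np.array([[1, 2]])
print(tile_once(a, np.array(3)))  # three stacked copies of `a`
```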
Change ellipsis place
For some reason, appveyor output is different:
array([0, 1, 2, 3, 4, 5], dtype=int32)
While travis output is:
array([0, 1, 2, 3, 4, 5]) | @@ -659,7 +659,7 @@ class Rotor(object):
>>> evalues, evectors = rotor._eigen(0, sorted_=True)
>>> idx = rotor._index(evalues)
>>> idx[:6] # doctest: +ELLIPSIS
- array([0, 1, 2, 3, 4, 5], ...
+ array([0, 1, 2, 3, 4, ...
"""
# avoid float point errors when sorting
evals_truncated = np.around(eigenvalues, decimals=10)
@@ -2232,7 +2232,7 @@ class Rotor(object):
... BearingElement(n=3, kxx=1e6, cxx=0, kyy=1e6, cyy=0, kxy=0, cxy=0, kyx=0, cyx=0)],
... w=0, nel_r=1)
>>> rotor.run_modal()
- >>> rotor.wn
+ >>> rotor.wn.round(4)
array([ 85.76341374, 85.76341374, 271.93258207, 271.93258207,
718.58003871, 718.58003871])
"""
|
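For context, a minimal sketch of how the `# doctest: +ELLIPSIS` directive tolerates the platform-dependent reprs quoted above (illustrative docstring, not from the project):

```python
import doctest

def sorted_indices():
    """
    >>> import numpy as np
    >>> np.arange(6, dtype=np.int32)  # doctest: +ELLIPSIS
    array([0, 1, 2, 3, 4, ...
    """

# Moving the ellipsis earlier lets one expected output match both
# 'array([0, 1, 2, 3, 4, 5], dtype=int32)' and 'array([0, 1, 2, 3, 4, 5])'.
doctest.run_docstring_examples(sorted_indices, {}, verbose=False)
```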
Standalone: Improvements for standalone mode
* Also bump directory name of dependency cache files, so we make sure
to take advantage of it.
cache_dir = os.path.join(
getCacheDir(),
- "library_deps_pefile"
+ "library_deps2_pefile"
if Options.isExperimental("use_pefile")
- else "library_deps",
+ else "library_deps2",
)
makePath(cache_dir)
@@ -775,6 +775,12 @@ def _parseDependsExeOutput2(lines, result):
dll_filename = os.path.abspath(dll_filename)
+ dll_name = os.path.basename(dll_filename)
+
+ # Ignore this runtime DLL of Python2.
+ if dll_name in ("msvcr90.dll",):
+ continue
+
# The executable itself is of course exempted. We cannot check its path
# because depends.exe mistreats unicode paths.
if first:
@@ -799,7 +805,15 @@ def _parseDependsExeOutput(filename, result):
_parseDependsExeOutput2(getFileContentByLine(filename), result)
+_scan_dir_cache = {}
+
+
def getScanDirectories(package_name, original_dir):
+ cache_key = package_name, original_dir
+
+ if cache_key in _scan_dir_cache:
+ return _scan_dir_cache[cache_key]
+
scan_dirs = [sys.prefix]
if package_name is not None:
@@ -819,6 +833,8 @@ def getScanDirectories(package_name, original_dir):
if not os.path.isdir(path_dir):
continue
+ if areSamePaths(path_dir, os.path.join(os.environ["SYSTEMROOT"])):
+ continue
if areSamePaths(path_dir, os.path.join(os.environ["SYSTEMROOT"], "System32")):
continue
if areSamePaths(path_dir, os.path.join(os.environ["SYSTEMROOT"], "SysWOW64")):
@@ -842,6 +858,7 @@ def getScanDirectories(package_name, original_dir):
result.append(scan_dir)
+ _scan_dir_cache[cache_key] = result
return result
|
invalid return type for AsyncGenerator (Documentation)
* invalid return type for AsyncGenerator
fixed the typing for subscription AsyncGenerator, as well as a small typo on line 22. Added short Note about AsyncGenerator typing, with link to typing docs.
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see | @@ -12,19 +12,20 @@ This is how you define a subscription-capable resolver:
```python
import asyncio
+from typing import AsyncGenerator
import strawberry
@strawberry.type
class Query:
@strawberry.field
- def hello() -> str:
+ def hello(self) -> str:
return "world"
@strawberry.type
class Subscription:
@strawberry.subscription
- async def count(self, target: int = 100) -> int:
+ async def count(self, target: int = 100) -> AsyncGenerator[int, None]:
for i in range(target):
yield i
await asyncio.sleep(0.5)
@@ -36,6 +37,12 @@ Like queries and mutations, subscriptions are defined in a class and passed to
the Schema function. Here we create a rudimentary counting function which counts
from 0 to the target sleeping between each loop iteration.
+<Note>
+The return type of `count` is `AsyncGenerator` where the first generic
+argument is the actual type of the response, in most cases the second argument
+should be left as `None` (more about Generator typing [here](https://docs.python.org/3/library/typing.html#typing.AsyncGenerator)).
+</Note>
+
We would send the following GraphQL document to our server to subscribe to this
data stream:
|
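A self-contained sketch of the corrected typing, using only the standard library (the counting resolver mirrors the docs' example without Strawberry):

```python
import asyncio
from typing import AsyncGenerator

async def count(target: int = 3) -> AsyncGenerator[int, None]:
    # First type argument: the yielded type. Second: the type accepted by
    # asend(); None is right when nothing is sent back, as in subscriptions.
    for i in range(target):
        yield i
        await asyncio.sleep(0.1)

async def main() -> None:
    async for value in count():
        print(value)  # 0, 1, 2

asyncio.run(main())
```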
Product didn't like empty vectors. Very sad.
Not pog | @@ -920,6 +920,7 @@ def prepend(vector, item):
def product(vector):
if type(vector) is Generator:
return vector._reduce(multiply)
+ if not vector: return vector
ret = vector[0]
for item in vector[1:]:
ret = multiply(ret, item)
|
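The guard reduced to a plain-Python sketch (Vyxal's `multiply` helper is replaced by `operator.mul` so the example runs standalone):

```python
from functools import reduce
from operator import mul

def product(vector):
    # The fix: return an empty vector unchanged instead of indexing
    # vector[0], which would raise an IndexError.
    if not vector:
        return vector
    return reduce(mul, vector[1:], vector[0])

print(product([2, 3, 4]))  # 24
print(product([]))         # []
```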
langkit.parsers.creates_node: add docstring
TN: | @@ -1536,15 +1536,19 @@ class NodeToParsersPass():
def creates_node(p, follow_refs=True):
"""
- Predicate that is true on parsers that create a node directly, or are just
- a reference to one or several parsers that creates nodes, without
- additional parsing involved.
+ Return true on parsers that create a node directly, or are just a reference
+ to one or several parsers that creates nodes, without additional parsing
+ involved.
For example::
Node(..) # <- True
Or(a, b, c) # <- True if a b & c creates_node
_Row(a, b, c) # <- False
Pick(";", "lol", c) # <- False
+
+ :param Parser p: Parser to analyze.
+ :param bool follow_refs: Whether to recurse on sub-parsers and dereference
+ Defer parsers.
"""
from langkit.dsl import EnumNode
|
Added "performanceadexchange.com" to "data/StevenBlack/hosts"
Just discovered this one today while dealing with some pop-ups.
127.0.0.1 performanceadexchange.com
127.0.0.1 www.performanceadexchange.com | 127.0.0.1 sportsinteraction.com
127.0.0.1 www.sportsinteraction.com
127.0.0.1 embed.sendtonews.com
+127.0.0.1 performanceadexchange.com
+127.0.0.1 www.performanceadexchange.com
127.0.0.1 privacyassistant.net
127.0.0.1 ext.privacyassistant.net
127.0.0.1 pussysaga.com
|
Hyperlinks for app page to task pages should be on the task ID, not the app name
What you are clicking on/choosing is a task, not an app - you are already on the page for an app.
This is already how dependencies are represented in the same table: clickable task IDs | <thead>
<tr>
<th>Name</th>
- <th>Task_ID</th>
+ <th>Task ID</th>
<th>Dependencies</th>
<th>Executor</th>
<th>Started</th>
<tbody>
{% for t in task_summary %}
<tr>
- <td><a href="/workflow/{{ workflow_details['run_id'] }}/task/{{ t['task_id'] }}">{{ t.task_func_name }}</a></td>
- <td>{{ t['task_id'] }}</td>
+ <td>{{ t.task_func_name }}</td>
+ <td><a href="/workflow/{{ workflow_details['run_id'] }}/task/{{ t['task_id'] }}">{{ t['task_id'] }}</a></td>
<td>
{% if t['task_depends'] %}
{% for id in t['task_depends'].split(",") %}
|
Disable Dirichlet GPU test
Summary: broken until PyTorch supports backwards pass on GPU | @@ -232,6 +232,7 @@ class TestGridworldSAC(GridworldTestBase):
def test_sac_trainer_w_dirichlet_actor(self):
self._test_sac_trainer(constrain_action_sum=True)
- @unittest.skipIf(not torch.cuda.is_available(), "CUDA not available")
+ # TODO: Renable when PyTorch supports backwards pass in CUDA.
+ @unittest.skipIf(True or not torch.cuda.is_available(), "CUDA not available")
def test_sac_trainer_w_dirichlet_actor_gpu(self):
self._test_sac_trainer(use_gpu=True, constrain_action_sum=True)
|
buildPy2exe: Find installed NSIS through registry
Require NSIS >=3 for unicode support | ) If you get the error "ImportError: No module named zope.interface" then add an empty __init__.py file to the PYTHONDIR/Lib/site-packages/zope directory
-2) It is expected that you will have NSIS 3 NSIS from http://nsis.sourceforge.net installed to: C:\Program Files (x86)\NSIS\
+2) It is expected that you will have NSIS 3 NSIS from http://nsis.sourceforge.net installed.
'''
@@ -31,8 +31,24 @@ if missingStrings is not None and missingStrings is not "":
import warnings
warnings.warn("MISSING/UNUSED STRINGS DETECTED:\n{}".format(missingStrings))
-p = "C:\\Program Files (x86)\\NSIS\\makensis.exe" #TODO: how to move that into proper place, huh
-NSIS_COMPILE = p if os.path.isfile(p) else "makensis.exe"
+def get_nsis_path():
+ bin_name = "makensis.exe"
+ try:
+ from winreg import HKEY_LOCAL_MACHINE as HKLM
+ from winreg import KEY_READ, KEY_WOW64_32KEY, OpenKey, QueryValueEx
+ except ImportError:
+ return bin_name
+
+ try:
+ nsisreg = OpenKey(HKLM, "Software\\NSIS", 0, KEY_READ | KEY_WOW64_32KEY)
+ if QueryValueEx(nsisreg, "VersionMajor")[0] >= 3:
+ return "{}\\{}".format(QueryValueEx(nsisreg, "")[0], bin_name)
+ else:
+ raise Exception("You must install NSIS 3 or later.")
+ except WindowsError:
+ return bin_name
+NSIS_COMPILE = get_nsis_path()
+
OUT_DIR = "syncplay_v{}".format(syncplay.version)
SETUP_SCRIPT_PATH = "syncplay_setup.nsi"
NSIS_SCRIPT_TEMPLATE = r"""
|
Setting mpi4py (3.1.0) for py310
Using the oldest compatible mpi4py (3.1.0) for py-3.9 and py-3.10. | @@ -46,8 +46,7 @@ if setup_configure.mpi_enabled():
RUN_REQUIRES.append('mpi4py >=3.0.2')
SETUP_REQUIRES.append("mpi4py ==3.0.2; python_version<'3.8'")
SETUP_REQUIRES.append("mpi4py ==3.0.3; python_version=='3.8.*'")
- SETUP_REQUIRES.append("mpi4py ==3.0.3; python_version=='3.9.*'")
- SETUP_REQUIRES.append("mpi4py ==3.1.3; python_version>='3.10'")
+ SETUP_REQUIRES.append("mpi4py ==3.1.0; python_version>='3.9'")
# Set the environment variable H5PY_SETUP_REQUIRES=0 if we need to skip
# setup_requires for any reason.
|
api_docs: Fix typos in UserBase schema descriptions.
Fixes two small typos and adds backticks to a reference to an
object field. | @@ -14802,7 +14802,7 @@ components:
type: boolean
description: |
A boolean specifying whether the user is an organization owner.
- If true, is_admin will also be true.
+ If true, `is_admin` will also be true.
**Changes**: New in Zulip 3.0 (feature level 8).
is_billing_admin:
@@ -14820,8 +14820,8 @@ components:
- 400
- 600
description: |
- [Organization-level role](/help/roles-and-permissions)) of the user.
- Poosible values are:
+ [Organization-level role](/help/roles-and-permissions) of the user.
+ Possible values are:
- Organization owner => 100
- Organization administrator => 200
|
[StreamingFC] "fix" weight FIFO depth pragma, but it won't work
turns out HLS does not let us specify stream depths on top level
AXI streams, need another solution | @@ -930,7 +930,7 @@ class StreamingFCLayer_Batch(HLSCustomOp):
"#pragma HLS INTERFACE axis port=weights"
)
self.code_gen_dict["$PRAGMAS$"].append(
- "#pragma HLS stream depth=8 variable=8"
+ "#pragma HLS stream depth=8 variable=weights"
)
else:
|
revocation_notifier: mark webhook threads as daemon and add timeout
This ensures that those threads do not block on shutdown. | @@ -120,7 +120,7 @@ def notify_webhook(tosend):
for i in range(config.getint('cloud_verifier', 'max_retries')):
next_retry = retry.retry_time(exponential_backoff, interval, i, logger)
try:
- response = session.post(url, json=tosend)
+ response = session.post(url, json=tosend, timeout=5)
if response.status_code in [200, 202]:
break
@@ -134,7 +134,7 @@ def notify_webhook(tosend):
time.sleep(next_retry)
w = functools.partial(worker_webhook, tosend, url)
- t = threading.Thread(target=w)
+ t = threading.Thread(target=w, daemon=True)
t.start()
|
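A minimal sketch of the shutdown behaviour the change relies on (illustrative worker standing in for the webhook retry loop):

```python
import threading
import time

def worker():
    time.sleep(3600)  # a retry loop sleeping between attempts could block this long

# daemon=True: the interpreter exits without joining this thread, so a
# stuck notification no longer blocks shutdown; the 5-second request
# timeout additionally bounds each individual post.
t = threading.Thread(target=worker, daemon=True)
t.start()
print("main thread exits immediately")
```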
Fixing typos in schema description for BatchMatMul
Summary:
Pull Request resolved:
Fixing typos in the description of schema for one of the inputs for BatchMatMul operator. | @@ -129,7 +129,7 @@ from 0 to (dim0 * dim1 ...) - 1. rank(A) == rank(B) >= 2. In case of A and B bei
two diemnsional, it behaves like normal matrix multiplication.
)DOC")
.Input(0, "A", "tensor of shape (dim0, dim1 ... M, K)")
- .Input(1, "B", "tensor of shpae (dim0, dim2 ... K, N)")
+ .Input(1, "B", "tensor of shape (dim0, dim1 ... K, N)")
.Output(0, "Y", "tensor of shape (dim0, dim1 ... M, N)")
.Arg(
"trans_a",
|
Allow zero value for timestamps with TimestampField.
Fixes | @@ -4556,9 +4556,7 @@ class TimestampField(BigIntegerField):
def python_value(self, value):
if value is not None and isinstance(value, (int, float, long)):
- if value == 0:
- return
- elif self.resolution > 1:
+ if self.resolution > 1:
ticks_to_microsecond = 1000000 // self.resolution
value, ticks = divmod(value, self.resolution)
microseconds = int(ticks * ticks_to_microsecond)
|
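A worked example of the resolution arithmetic in the surrounding code (values illustrative; resolution 1000 means the stored integer packs milliseconds):

```python
resolution = 1000
stored = 1_655_000_000_123  # 1,655,000,000 seconds plus 123 ms

ticks_to_microsecond = 1_000_000 // resolution    # 1000 us per tick
value, ticks = divmod(stored, resolution)         # split off sub-second part
microseconds = int(ticks * ticks_to_microsecond)
print(value, microseconds)  # 1655000000 123000

# With the fix, stored == 0 flows through the same path and maps to the
# epoch rather than being coerced to None.
```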
metrics: disable metrics by default
pghoard should not send metrics if no provider is enabled. | @@ -546,7 +546,7 @@ class PGHoard:
self.log.exception("Problem with log_level: %r", self.log_level)
# Setup monitoring clients
- self.metrics = metrics.Metrics(statsd=self.config.get("statsd", {}))
+ self.metrics = metrics.Metrics(statsd=self.config.get("statsd", None))
for thread in self._get_all_threads():
thread.config = new_config
|
Remove class name in notifications for whole class started/completed-type events
Fixes | @@ -19,14 +19,14 @@ const nStrings = createTranslator('NotificationStrings', {
// started
individualStarted: `{learnerName} started '{itemName}'`,
multipleStarted: `{learnerName} and {numOthers, number} {numOthers, plural, one {other} other {others}} started '{itemName}'`,
- wholeClassStarted: `Everyone in '{className}' started '{itemName}'`,
+ wholeClassStarted: `Everyone started '{itemName}'`,
wholeGroupStarted: `Everyone in '{groupName}' started '{itemName}'`,
everyoneStarted: `Everyone started '{itemName}'`,
// completed
individualCompleted: `{learnerName} completed '{itemName}'`,
multipleCompleted: `{learnerName} and {numOthers, number} {numOthers, plural, one {other} other {others}} completed '{itemName}'`,
- wholeClassCompleted: `Everyone in '{className}' completed '{itemName}'`,
+ wholeClassCompleted: `Everyone completed '{itemName}'`,
wholeGroupCompleted: `Everyone in '{groupName}' completed '{itemName}'`,
everyoneCompleted: `Everyone completed '{itemName}'`,
|
use p4a --add-source instead of manual copy
Currently, android.add_src does not work anymore.
Using --add-source from p4a makes it work again. | @@ -918,6 +918,12 @@ class TargetAndroid(Target):
cmd.append('--blacklist')
cmd.append(realpath(expanduser(blacklist_src)))
+ # support for java directory
+ javadirs = self.buildozer.config.getlist('app', 'android.add_src', [])
+ for javadir in javadirs:
+ cmd.append('--add-source')
+ cmd.append(realpath(expanduser(javadir)))
+
# support for aars
aars = self.buildozer.config.getlist('app', 'android.add_aars', [])
for aar in aars:
@@ -1112,9 +1118,6 @@ class TargetAndroid(Target):
# update the project.properties libraries references
self._update_libraries_references(dist_dir)
- # add src files
- self._add_java_src(dist_dir)
-
# generate the whitelist if needed
self._generate_whitelist(dist_dir)
@@ -1399,26 +1402,6 @@ class TargetAndroid(Target):
self.buildozer.debug('project.properties updated')
- def _add_java_src(self, dist_dir):
- java_src = self.buildozer.config.getlist('app', 'android.add_src', [])
-
- gradle_files = ["build.gradle", "gradle", "gradlew"]
- is_gradle_build = any((
- exists(join(dist_dir, x)) for x in gradle_files))
- if is_gradle_build:
- src_dir = join(dist_dir, "src", "main", "java")
- self.buildozer.info(
- "Gradle project detected, copy files {}".format(src_dir))
- else:
- src_dir = join(dist_dir, 'src')
- self.buildozer.info(
- "Ant project detected, copy files in {}".format(src_dir))
-
- for pattern in java_src:
- for fn in glob(expanduser(pattern.strip())):
- last_component = basename(fn)
- self.buildozer.file_copytree(fn, join(src_dir, last_component))
-
@property
def serials(self):
if hasattr(self, '_serials'):
|
cloudnoisemodel only imports from construction when needed
Apparently it's not needed very often, since it looks as though that
branch has never been run | @@ -18,7 +18,6 @@ from . import spamvec as _sv
from . import povm as _povm
from . import qubitgraph as _qgraph
from . import labeldicts as _ld
-from .. import construction as _construction
from ..tools import optools as _gt
from ..tools import basistools as _bt
from ..tools import internalgates as _itgs
@@ -561,8 +560,10 @@ class CloudNoiseModel(_ImplicitOpModel):
basis1Q = _BuiltinBasis("pp",4)
prep_factors = []; povm_factors = []
- v0 = _construction.basis_build_vector("0", basis1Q)
- v1 = _construction.basis_build_vector("1", basis1Q)
+ from ..construction import basis_build_vector
+
+ v0 = basis_build_vector("0", basis1Q)
+ v1 = basis_build_vector("1", basis1Q)
# Historical use of TP for non-term-based cases?
# - seems we could remove this. FUTURE REMOVE?
|
GDB helpers: adapt ReferencedEnv pretty-printers to new implementation
TN: | @@ -318,26 +318,8 @@ class ReferencedEnvPrinter(BasePrinter):
)
)
- @property
- def resolver_name(self):
- """
- If we can determine the name of the property for this resolver, return
- it. Return None otherwise.
- """
- resolver = self.value['resolver']
- m = re.match(r'0x[a-f0-9]+ <.*\.([a-z_]+)>', str(resolver))
- if m:
- return m.group(1)
-
def to_string(self):
- from_node = self.value['from_node']
- resolver = self.value['resolver']
-
- resolver_name = self.resolver_name
- if resolver_name:
- return '{}.{}'.format(from_node, resolver_name)
- else:
- return '{} ({})'.format(resolver, from_node)
+ return EnvGetterPrinter(self.value['getter'], self.context).image
class EntityPrinter(BasePrinter):
|
Skip unit test for module mappings with conflicts for Tmod4
The scenario is invalid for this modules system. | @@ -255,6 +255,9 @@ def test_emit_loadenv_commands_with_confict(base_environ, user_runtime,
def test_emit_loadenv_commands_mapping_with_conflict(base_environ,
user_runtime,
modules_system):
+ if modules_system.name == 'tmod4':
+ pytest.skip('test scenario not valid for tmod4')
+
e0 = env.Environment(name='e0', modules=['testmod_ext'])
ms = rt.runtime().modules_system
ms.load_mapping('testmod_ext: testmod_ext testmod_foo')
|
v5.3.4.0
Changelog: | @@ -3671,14 +3671,15 @@ if form.getvalue('loadopenvpn'):
openvpn_sess = ''
openvpn = ''
- try:
- os_name = platform.linux_distribution()[0]
- except Exception:
- os_name = ''
-
- if distro.id() != 'ubuntu':
+ if distro.id() == 'ubuntu':
+ stdout, stderr = funct.subprocess_execute("apt show openvpn3 2>&1|grep E:")
+ elif distro.id() == 'centos' or distro.id() == 'rhel':
stdout, stderr = funct.subprocess_execute("rpm --query openvpn3-client")
- if stdout[0] != 'package openvpn3-client is not installed' and stderr != '/bin/sh: rpm: command not found':
+
+ if (
+ (stdout[0] != 'package openvpn3-client is not installed' and stderr != '/bin/sh: rpm: command not found') and
+ stdout[0] != 'E: No packages found'
+ ):
cmd = "sudo openvpn3 configs-list |grep -E 'ovpn|(^|[^0-9])[0-9]{4}($|[^0-9])' |grep -v net|awk -F\" \" '{print $1}'|awk 'ORS=NR%2?\" \":\"\\n\"'"
openvpn_configs, stderr = funct.subprocess_execute(cmd)
cmd = "sudo openvpn3 sessions-list|grep -E 'Config|Status'|awk -F\":\" '{print $2}'|awk 'ORS=NR%2?\" \":\"\\n\"'| sed 's/^ //g'"
|
Remove unnecessary conditional
`state_like` is definitely a numpy array (see if statement on line 304). | @@ -305,13 +305,12 @@ def _state_like_to_state_tensor(*, state_like: 'cirq.STATE_VECTOR_LIKE',
prod = np.prod(qid_shape, dtype=int)
if len(qid_shape) == prod:
- if (not isinstance(state_like, np.ndarray) or
- state_like.dtype.kind != 'c'):
+ if state_like.dtype.kind != 'c':
raise ValueError(
'Because len(qid_shape) == product(qid_shape), it is '
- 'ambiguous whether or not the given `state_like` is a '
- 'state vector or a list of computational basis values for '
- 'the qudits. In this situation you are required to pass '
+ 'ambiguous whether the given `state_like` contains '
+ 'state vector amplitudes or per-qudit computational basis '
+ 'values. In this situation you are required to pass '
'in a state vector that is a numpy array with a complex '
'dtype.')
|
removing error help text separately for domain and app
using gettext for translation
domain = $form.find("#id_domain").val(),
$modal = $("#copy-toggles");
- if(!validateCopyApplicationForm($form)){
+ if(!isCopyApplicationFormValid($form)){
return false;
}
@@ -75,7 +75,7 @@ hqDefine("app_manager/js/app_view_application", function() {
* @param form
* @returns {boolean}
*/
- var validateCopyApplicationForm = function(form){
+ var isCopyApplicationFormValid = function(form){
var domainDiv = form.find("#div_id_domain"),
appNameDiv = form.find("#div_id_name"),
domain = domainDiv.find("#id_domain"),
@@ -84,16 +84,18 @@ hqDefine("app_manager/js/app_view_application", function() {
domainNames = initial_page_data("domain_names");
appNameDiv.removeClass('has-error');
+ domainDiv.find('.help-block').remove();
+
domainDiv.removeClass('has-error');
- form.find('.help-block').remove();
+ appNameDiv.find('.help-block').remove();
//if application name is not entered
- if(!appName.val()){
+ if(!appName.val().trim()){
appNameDiv.addClass('has-error');
error = true;
- var apperrorMessage = 'Application name is required';
+ var appErrorMessage = gettext('Application name is required');
- appName.after($("<span class=\"help-block\"></span>").text(apperrorMessage));
+ appName.after($("<span class=\"help-block\"></span>").text(appErrorMessage));
}
//if project/domain is not selected or invalid domain is selected
@@ -101,13 +103,13 @@ hqDefine("app_manager/js/app_view_application", function() {
domainDiv.addClass('has-error');
error = true;
- var domainerrorMessage = 'Invalid Project Selected';
+ var domainErrorMessage = gettext('Invalid Project Selected');
if(!domain.val()){
- domainerrorMessage = 'Project name is required';
+ domainErrorMessage = gettext('Project name is required');
}
- domain.after($("<span class=\"help-block\"></span>").text(domainerrorMessage));
+ domain.after($("<span class=\"help-block\"></span>").text(domainErrorMessage));
}
return !error;
|
Update mgb2.py
remove the global namespace import | @@ -11,8 +11,7 @@ and a similar evaluation set of 10 hours.
Both the development and evaluation data have been released in the 2016 MGB challenge
"""
-import logging
-import tarfile
+from logging import info
from itertools import chain
from pathlib import Path
from typing import Dict, Optional, Union
@@ -39,7 +38,7 @@ def download_mgb2(
:param target_dir: Pathlike, the path of the dir to storage the dataset.
"""
- logging.info(
+ info(
"MGB2 is not available for direct download. Please fill out the form at"
" https://arabicspeech.org/mgb2 to download the corpus."
)
@@ -93,11 +92,11 @@ def prepare_mgb2(
)
for part in dataset_parts:
- logging.info(f"Processing MGB2 subset: {part}")
+ info(f"Processing MGB2 subset: {part}")
if manifests_exist(
part=part, output_dir=output_dir, prefix="mgb2", suffix="jsonl.gz"
):
- logging.info(f"MGB2 subset: {part} already prepared - skipping.")
+ info(f"MGB2 subset: {part} already prepared - skipping.")
continue
# Read the recordings and write them into manifest. We additionally store the
|
Add git-ignored files in flake8 excluded files
Now that flake8 runs locally before a commit is pushed remotely, it may
throw errors for ignored files. This commit fixes that by adding all the
files in .gitignore to all the lists of excluded files inside the script
check_flake8.
# Files or directories to exclude from checks applied to all files.
exclude_all=(
-./tests/'*',.git,__pycache__
+./tests/'*'
+.git
+__pycache__
+$(cat .gitignore)
)
@@ -56,6 +59,7 @@ W504 # line break after binary operator
# Files or directories to exclude from checks applied to changed files.
exclude_changed=(
./tests/'*'
+$(cat .gitignore)
)
@@ -70,6 +74,7 @@ W503 # line break before binary operator
# Files or directories to exclude from checks applied to added files.
exclude_added=(
./tests/'*'
+$(cat .gitignore)
)
@@ -87,7 +92,9 @@ W503 # line break before binary operator
# Files or directories to exclude from checks applied to all files.
test_exclude_all=(
- .git,__pycache__
+.git
+__pycache__
+$(cat .gitignore)
)
@@ -100,6 +107,7 @@ D # docstring rules disabled
# Files or directories to exclude from checks applied to changed test files.
test_exclude_changed=(
+$(cat .gitignore)
)
@@ -112,6 +120,7 @@ D # docstring rules disabled
# Files or directories to exclude from checks applied to added test files.
test_exclude_added=(
+$(cat .gitignore)
)
|
fix: Fix delete sfv twice issue
fix delete sfv twice issue | @@ -930,7 +930,10 @@ class FeatureStore:
views_to_delete = [
ob
for ob in objects_to_delete
- if isinstance(ob, FeatureView) or isinstance(ob, BatchFeatureView)
+ if (
+ (isinstance(ob, FeatureView) or isinstance(ob, BatchFeatureView))
+ and not isinstance(ob, StreamFeatureView)
+ )
]
request_views_to_delete = [
ob for ob in objects_to_delete if isinstance(ob, RequestFeatureView)
|
forms: Save realm_creation setting on RegistrationForm.
This will be useful for making the checking behavior depend on the
status of this form. | @@ -80,7 +80,7 @@ class RegistrationForm(forms.Form):
# Since the superclass doesn't except random extra kwargs, we
# remove it from the kwargs dict before initializing.
- realm_creation = kwargs['realm_creation']
+ self.realm_creation = kwargs['realm_creation']
del kwargs['realm_creation']
super(RegistrationForm, self).__init__(*args, **kwargs)
@@ -88,7 +88,7 @@ class RegistrationForm(forms.Form):
self.fields['terms'] = forms.BooleanField(required=True)
self.fields['realm_name'] = forms.CharField(
max_length=Realm.MAX_REALM_NAME_LENGTH,
- required=realm_creation)
+ required=self.realm_creation)
def clean_full_name(self):
# type: () -> Text
|
Improve proxy vars in user_variables.yml
1. Change comments to make it clear that proxy_env_url is used by
apt-cacher-ng and must always be set when an http proxy is required.
2. Add keystone hosts to no_proxy list in deployment env vars
because os_keystone role tests this address to check services are
up. | @@ -118,8 +118,10 @@ debug: false
# placed both on the hosts and inside the containers.
## Example environment variable setup:
-## (1) This sets up a permanent environment, used during and after deployment:
+## This is used by apt-cacher-ng to download apt packages:
# proxy_env_url: http://username:[email protected]:9000/
+
+## (1) This sets up a permanent environment, used during and after deployment:
# no_proxy_env: "localhost,127.0.0.1,{{ internal_lb_vip_address }},{{ external_lb_vip_address }},{% for host in groups['all_containers'] %}{{ hostvars[host]['container_address'] }}{% if not loop.last %},{% endif %}{% endfor %}"
# global_environment_variables:
# HTTP_PROXY: "{{ proxy_env_url }}"
@@ -131,9 +133,9 @@ debug: false
#
## (2) This is applied only during deployment, nothing is left after deployment is complete:
# deployment_environment_variables:
-# http_proxy: http://username:[email protected]:9000/
-# https_proxy: http://username:[email protected]:9000/
-# no_proxy: "localhost,127.0.0.1,{{ internal_lb_vip_address }},{{ external_lb_vip_address }}"
+# http_proxy: "{{ proxy_env_url }}"
+# https_proxy: "{{ proxy_env_url }}"
+# no_proxy: "localhost,127.0.0.1,{{ internal_lb_vip_address }},{{ external_lb_vip_address }},{% for host in groups['keystone_all'] %}{{ hostvars[host]['container_address'] }}{% if not loop.last %},{% endif %}{% endfor %}"
## SSH connection wait time
|
Small cleanup in DB replicator's error handling
Generally, we shouldn't compare integers with "is". It happens to work
because (a) CPython has only one instance of each integer between -5
and 256, and (b) errno.ENOTEMPTY == 66 on Linux, but it's better not
to rely on those details. | @@ -110,7 +110,7 @@ def roundrobin_datadirs(datadirs):
try:
os.rmdir(hash_dir)
except OSError as e:
- if e.errno is not errno.ENOTEMPTY:
+ if e.errno != errno.ENOTEMPTY:
raise
its = [walk_datadir(datadir, node_id) for datadir, node_id in datadirs]
|
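A short demonstration of point (a) above (CPython-specific; `int("257")` dodges per-code-object constant caching that would otherwise mask the effect):

```python
import errno

a, b = 256, int("256")
print(a is b)   # True on CPython: ints in [-5, 256] are shared singletons

c, d = 257, int("257")
print(c is d)   # False: 257 is not interned, so identity diverges

# Hence the fix: compare errno values with !=, never with `is not`.
print(errno.ENOTEMPTY)  # 66 on Linux, but platform-dependent
```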
change: Integration Tests now dynamically check AZs
Canaries have failed in Tokyo (ap-northeast-1) since inception because
the ap-northeast-1b availability zone doesn't exist.
This commit makes the check dynamic instead of blindly assuming that
a and b exist, as that isn't the case for 3 regions:
ap-northeast-1
ap-northeast-2
sa-east-1 | @@ -51,13 +51,16 @@ def _create_vpc_with_name(ec2_client, region, name):
vpc_id = ec2_client.create_vpc(CidrBlock="10.0.0.0/16")["Vpc"]["VpcId"]
print("created vpc: {}".format(vpc_id))
- # sagemaker endpoints require subnets in at least 2 different AZs for vpc mode
+ availability_zone_name = ec2_client.describe_availability_zones()["AvailabilityZones"][0][
+ "ZoneName"
+ ]
+
subnet_id_a = ec2_client.create_subnet(
- CidrBlock="10.0.0.0/24", VpcId=vpc_id, AvailabilityZone=(region + "a")
+ CidrBlock="10.0.0.0/24", VpcId=vpc_id, AvailabilityZone=availability_zone_name
)["Subnet"]["SubnetId"]
print("created subnet: {}".format(subnet_id_a))
subnet_id_b = ec2_client.create_subnet(
- CidrBlock="10.0.1.0/24", VpcId=vpc_id, AvailabilityZone=(region + "b")
+ CidrBlock="10.0.1.0/24", VpcId=vpc_id, AvailabilityZone=availability_zone_name
)["Subnet"]["SubnetId"]
print("created subnet: {}".format(subnet_id_b))
|
Modify public file query to prevent postgres hanging
[#PLAT-1021] | @@ -38,10 +38,10 @@ class FileSummary(SummaryAnalytics):
target_content_type=ContentType.objects.get_for_model(QuickFilesNode)
)
- public_query = (quickfiles_query | Q(
+ public_query = Q(
target_object_id__in=Node.objects.filter(is_public=True).values('id'),
target_content_type=node_content_type
- ))
+ )
private_query = Q(
target_object_id__in=Node.objects.filter(is_public=False).values('id'),
@@ -57,10 +57,10 @@ class FileSummary(SummaryAnalytics):
# OsfStorageFiles - the number of files on OsfStorage
'osfstorage_files_including_quickfiles': {
'total': file_qs.count(),
- 'public': file_qs.filter(public_query).count(),
+ 'public': file_qs.filter(public_query).count() + file_qs.filter(quickfiles_query).count(),
'private': file_qs.filter(private_query).count(),
'total_daily': file_qs.filter(daily_query).count(),
- 'public_daily': file_qs.filter(public_query & daily_query).count(),
+ 'public_daily': file_qs.filter(public_query & daily_query).count() + file_qs.filter(quickfiles_query & daily_query).count(),
'private_daily': file_qs.filter(private_query & daily_query).count(),
},
}
|
Cycles Renderer : Remove unused private field
To fix the following Clang error :
```
src/GafferCycles/IECoreCyclesPreview/Renderer.cpp:4254:9: error: private field 'm_lastProgress' is not used [-Werror,-Wunused-private-field]
float m_lastProgress;
``` | @@ -4251,7 +4251,6 @@ class CyclesRenderer final : public IECoreScenePreview::Renderer
IECore::MessageHandlerPtr m_messageHandler;
string m_lastError;
string m_lastStatus;
- float m_lastProgress;
double m_lastStatusTime;
// Caches
|
Improve Tox test harness
No need for both pycodestyle and flake8; flake8 includes the former.
Use the proper pytest invocation. | [tox]
-envlist = py27,py34,py35,py36,pycodestyle,isort-check
+envlist = py27,py34,py35,py36,style,isort-check
[testenv]
deps =
@@ -10,22 +10,17 @@ deps =
coverage
taxii2-client
commands =
- py.test --ignore=stix2/test/test_workbench.py --cov=stix2 stix2/test/ --cov-report term-missing
- py.test stix2/test/test_workbench.py --cov=stix2 --cov-report term-missing --cov-append
+ pytest --ignore=stix2/test/test_workbench.py --cov=stix2 stix2/test/ --cov-report term-missing
+ pytest stix2/test/test_workbench.py --cov=stix2 --cov-report term-missing --cov-append
passenv = CI TRAVIS TRAVIS_*
-[testenv:pycodestyle]
+[testenv:style]
deps =
flake8
- pycodestyle
commands =
- pycodestyle ./stix2
flake8
-[pycodestyle]
-max-line-length=160
-
[flake8]
max-line-length=160
@@ -37,7 +32,7 @@ commands =
[travis]
python =
- 2.7: py27, pycodestyle
- 3.4: py34, pycodestyle
- 3.5: py35, pycodestyle
- 3.6: py36, pycodestyle
+ 2.7: py27, style
+ 3.4: py34, style
+ 3.5: py35, style
+ 3.6: py36, style
|
Pontoon: Update Asturian (ast) localization of AMO
Localization authors:
Enol | @@ -5,7 +5,7 @@ msgstr ""
"Project-Id-Version: PACKAGE VERSION\n"
"Report-Msgid-Bugs-To: EMAIL@ADDRESS\n"
"POT-Creation-Date: 2018-09-25 06:38+0000\n"
-"PO-Revision-Date: 2018-09-24 01:27+0000\n"
+"PO-Revision-Date: 2018-09-18 00:42+0000\n"
"Last-Translator: Enol <[email protected]>\n"
"Language-Team: LANGUAGE <[email protected]>\n"
"Language: ast\n"
@@ -219,7 +219,6 @@ msgid_plural "{0} users"
msgstr[0] "{0} usuariu"
msgstr[1] "{0} usuarios"
-#, fuzzy
msgid "{0} add-on"
msgid_plural "{0} add-ons"
msgstr[0] "{0} complementu"
|
ci: retry if rate limited in roulette
[skip ci] | @@ -4,8 +4,10 @@ import re
import shlex
import subprocess
import sys
+import time
import urllib.request
from functools import lru_cache
+from urllib.error import HTTPError
@lru_cache(maxsize=None)
@@ -15,11 +17,25 @@ def fetch_pr_data(pr_number, repo, endpoint=""):
if endpoint:
api_url += f"/{endpoint}"
- req = urllib.request.Request(api_url)
- res = urllib.request.urlopen(req)
+ res = req(api_url)
return json.loads(res.read().decode("utf8"))
+def req(url):
+ "Simple resilient request call to handle rate limits."
+ retries = 0
+ while True:
+ try:
+ req = urllib.request.Request(url)
+ return urllib.request.urlopen(req)
+ except HTTPError as exc:
+ if exc.code == 403 and retries < 5:
+ retries += 1
+ time.sleep(retries)
+ continue
+ raise
+
+
def get_files_list(pr_number, repo="frappe/frappe"):
return [change["filename"] for change in fetch_pr_data(pr_number, repo, "files")]
|
Add region support for af-south-1 (Cape Town)
SIM
CR | @@ -34,6 +34,7 @@ def get_all_regions():
Region('eu-north-1', 'EU (Stockholm)'),
Region('ap-east-1', 'Asia Pacific (Hong Kong)'),
Region('me-south-1', 'Middle East (Bahrain)'),
+ Region('af-south-1', 'Africa (Cape Town)'),
]
|
dse: Use xreplace_constrained in compact_temporaries
xreplace needs to be called iteratively
to catch all subs, I believe due to nested subs | @@ -280,7 +280,9 @@ def compact_temporaries(exprs):
for k, v in g.items():
if k not in mapper:
# The temporary /v/ is retained, and substitutions may be applied
- processed.append(v.xreplace(mapper))
+ handle, _ = xreplace_constrained(v, mapper, repeat=True)
+ assert len(handle) == 1
+ processed.extend(handle)
return processed
|
.travis, osx: Bump python3.7 to 3.7.1
Released 2018-10-20 | @@ -54,7 +54,7 @@ matrix:
python: 3.7
language: generic
env:
- - PYTHON=3.7.0
+ - PYTHON=3.7.1
- PYTHON_PKG_VERSION=macosx10.9
- PRIV=sudo
# Cache installed python virtual env
|
Bug enumerating experiment IDs
In an environment like ours, where a few people were using the same
database over a long period of time, the list of experiment IDs
was getting too long.
This is a temporary workaround, but we'll need a more permanent solution. | @@ -553,7 +553,12 @@ class MongodbMgdConn :
try :
_collection_handle = self.mongodb_conn[self.database][collection]
- _experiment_list = _collection_handle.distinct('expid')
+ #_experiment_list = _collection_handle.distinct('expid')
+
+ # The document is getting too big, but a workaround was found.
+ # TODO: Find a more permanent solution to this.
+ _experiment_list_agg = _collection_handle.aggregate([ {"$group": {"_id": '$expid'}} ])
+ _experiment_list = ([_v['_id'] for _v in _experiment_list_agg])
if disconnect_finish :
self.disconnect()
|
Just disable a server when it fails to start
As suggested in | @@ -380,11 +380,11 @@ class WindowManager(object):
message = "\n\n".join([
"Could not start {}",
"{}",
- "Do you want to disable it temporarily?"
+ "Server will be disabled for this window"
]).format(config.name, str(e))
- if self._sublime.ok_cancel_dialog(message, "Disable"):
self._configs.disable(config.name)
+ self._sublime.message_dialog(message)
if session:
debug("window {} added session {}".format(self._window.id(), config.name))
|
hotspots: Decrease default hotspot icon z-index to 100.
This keeps hotspot icons positioned at the front of the message
viewport but behind sidebars (i.e. the left sidebar has a z-index
of 103). Hotspots associated with elements outside of the message
viewport should be individually adjusted at the bottom of hotspots.css. | .hotspot-icon {
position: fixed;
cursor: pointer;
- z-index: 103;
+ z-index: 100;
}
.hotspot-icon .dot {
border-left-color: hsl(0, 0%, 80%);
margin-top: -13px;
}
+
+/* individual icon z-indexing */
+#hotspot_stream_settings_icon {
+ z-index: 103;
+}
|
Minor changes for sphinx
Python version added.
Sphinx is now installed as well.
steps:
- uses: actions/checkout@v1
+ with:
+ 'python-version': '3.x'
- name: Run documentation script
run: |
#html documentation generation
+ pip install -U sphinx
cd docs && make html && cd ..;
# html documentation generation through doctr
pip install doctr;
|
(doc) update docker-toolbox
wording change | @@ -107,7 +107,7 @@ Running pre-create checks...
Error with pre-create check:
```
-This appears to be the installation isn't able to find the boot2docker.iso file in the Docker Toolbox installation folder or the user may behind a firewall or a proxy. The solution is download the boot2docker.iso manually and place it to the correct path then re-run docker quickstart terminal.
+This can arise if the installation is unable to find the `boot2docker.iso` file in the Docker Toolbox installation folder or if the user is behind a firewall or a proxy. The solution is to download the `boot2docker.iso` manually and place it in the correct path, then re-run docker quickstart terminal.
```bash
# Docker Cache Path, change `YOUR_USERNAME`
@@ -116,7 +116,7 @@ C:/Users/YOUR_USERNAME/.docker/machine/cache/boot2docker.iso
# Download link
https://github.com/boot2docker/boot2docker/releases/download/v19.03.4/boot2docker.iso
```
-Or you can use `curl` under command prompt with administrative rights.
+Alternatively, you can use `curl` in the command prompt. This method requires [administrative rights](https://windows101tricks.com/open-command-prompt-as-administrator-windows-10/).
```bash
# Docker Cache Path, change `YOUR_USERNAME`
@@ -124,7 +124,7 @@ curl -Lo C:/Users/YOUR_USERNAME/.docker/machine/cache/boot2docker.iso https://gi
```
!!! note
- If your Windows 10 build is 17063(or later) curl is included by default. All you need to do is run Command Prompt with administrative rights and you can use curl. The curl.exe is located at C:\Windows\System32. If you want to be able to use curl from anywhere, consider adding it to Path Environment Variables.
+ If your Windows 10 build is 17063(or later) curl is installed by default. All you need to do is run Command Prompt with administrative rights and you can use curl. The curl.exe is located at C:\Windows\System32. If you want to be able to use curl from anywhere, consider adding it to Path Environment Variables.
## Running Hummingbot
|
fw/version: Bump dev version.
We are relying on a newly available variable in devlib
so bump the version to remain in sync. | @@ -23,7 +23,7 @@ VersionTuple = namedtuple('Version', ['major', 'minor', 'revision', 'dev'])
version = VersionTuple(3, 2, 1, 'dev4')
-required_devlib_version = VersionTuple(1, 2, 1, 'dev4')
+required_devlib_version = VersionTuple(1, 2, 1, 'dev5')
def format_version(v):
|
Updated sophos-mobile-panel.yaml
Added a version extractor and the version listing download URL to the references.
info:
name: Sophos Mobile Panel Detect
- author: Adam Crosser
+ author: Adam Crosser,idealphase
severity: info
- reference: https://www.sophos.com/en-us/products/mobile-control
+ reference:
+ - https://www.sophos.com/en-us/products/mobile-control
+ - https://www.sophos.com/en-us/support/downloads/sophos-mobile
metadata:
shodan-query: http.title:"Sophos Mobile"
tags: panel,sophos
@@ -18,3 +20,9 @@ requests:
- type: word
words:
- "<title>Sophos Mobile</title>"
+
+ extractors:
+ - type: regex
+ group: 1
+ regex:
+ - 'src="\/javax\.faces\.resource\/texteditor\/texteditor\.js\.xhtml\?ln=primefaces&(.+)"><\/script>'
|
Don't set backup variable to new variable value.
The backup variable `IFS_bak` in the installation script should be set prior to changing the value.
exit 1;
elif [[ $(echo "${found_accounts}" | wc -l) -gt 1 ]]; then
log "Which billing account would you like to use?:"
- IFS=$'\n'
IFS_bak=$IFS
+ IFS=$'\n'
select opt in ${found_accounts} "cancel"; do
if [[ "${opt}" == "cancel" ]]; then
exit 0
|
Stick our script-completion logic into a documented method
Currently, script-cleanup logic is run from a `finally` clause toward the end of the long-ish `ScriptRunner._run_script` method.
This PR just moves script-cleanup logic into its own documented function, to make it easier to find and understand. | @@ -25,7 +25,7 @@ from streamlit import source_util
from streamlit import util
from streamlit.error_util import handle_uncaught_app_exception
from streamlit.media_file_manager import media_file_manager
-from streamlit.report_thread import ReportThread
+from streamlit.report_thread import ReportThread, ReportContext
from streamlit.report_thread import get_report_ctx
from streamlit.script_request_queue import ScriptRequest
from streamlit.logger import get_logger
@@ -346,12 +346,7 @@ class ScriptRunner(object):
handle_uncaught_app_exception(e)
finally:
- self._widgets.reset_triggers()
- self._widgets.cull_nonexistent(ctx.widget_ids_this_run.items())
- self.on_event.send(ScriptRunnerEvent.SCRIPT_STOPPED_WITH_SUCCESS)
- # delete expired files now that the script has run and files in use
- # are marked as active
- media_file_manager.del_expired_files()
+ self._on_script_finished(ctx)
# Use _log_if_error() to make sure we never ever ever stop running the
# script without meaning to.
@@ -360,6 +355,19 @@ class ScriptRunner(object):
if rerun_with_data is not None:
self._run_script(rerun_with_data)
+ def _on_script_finished(self, ctx: ReportContext) -> None:
+ """Called when our script finishes executing, even if it finished
+ early with an exception. We perform post-run cleanup here.
+ """
+ self._widgets.reset_triggers()
+ self._widgets.cull_nonexistent(ctx.widget_ids_this_run.items())
+ # Signal that the script has finished. (We use SCRIPT_STOPPED_WITH_SUCCESS
+ # even if we were stopped with an exception.)
+ self.on_event.send(ScriptRunnerEvent.SCRIPT_STOPPED_WITH_SUCCESS)
+ # Delete expired files now that the script has run and files in use
+ # are marked as active.
+ media_file_manager.del_expired_files()
+
class ScriptControlException(BaseException):
"""Base exception for ScriptRunner."""
|
update desktop app troubleshooting
* update desktop app troubleshooting
added an additional command to open the dev-tools for specific servers in the desktop app.
* Update source/install/desktop.rst | @@ -188,6 +188,11 @@ When reporting bugs found in the Mattermost Desktop app, it is helpful to includ
4. Save the file and then send it along with a description of your issue.
5. Go to ``View`` > ``Toggle Developer Tools`` to disable the Developer Tools.
+You can open an additional set of developer tools for each server you have added to the desktop app.
+The tools can be opened by pasting this command in the developer console you opened with the steps described above: ``document.getElementsByTagName("webview")[0].openDevTools();``
+
+Note that if you have more than one server added to the desktop client, you need to change the ``0`` to the number corresponding to the server you want to open in the developer tools, starting with ``0`` from the left.
+
Windows
~~~~~~~
|
test(conf.py): remove `os.linesep`
This should be used when parsing binary files, not text streams. The
regular expression fails on Windows because the stream contains `\n`,
but `os.linesep` is `\r\n`.
For reference, see | @@ -28,7 +28,7 @@ def tmp_commitizen_project(tmp_git_project):
def _get_gpg_keyid(signer_mail):
_new_key = cmd.run(f"gpg --list-secret-keys {signer_mail}")
_m = re.search(
- rf"[a-zA-Z0-9 \[\]-_]*{os.linesep}[ ]*([0-9A-Za-z]*){os.linesep}[{os.linesep}a-zA-Z0-9 \[\]-_<>@]*",
+ r"[a-zA-Z0-9 \[\]-_]*\n[ ]*([0-9A-Za-z]*)\n[\na-zA-Z0-9 \[\]-_<>@]*",
_new_key.out,
)
return _m.group(1) if _m else None
@@ -42,8 +42,8 @@ def tmp_commitizen_project_with_gpg(tmp_commitizen_project):
# create a temporary GPGHOME to store a temporary keyring.
# Home path must be less than 104 characters
gpg_home = tempfile.TemporaryDirectory(suffix="_cz")
- # os.environ["GNUPGHOME"] = gpg_home.name # tempdir = temp keyring
- # os.environ["HOMEDIR"] = gpg_home.name
+ if os.name != "nt":
+ os.environ["GNUPGHOME"] = gpg_home.name # tempdir = temp keyring
# create a key (a keyring will be generated within GPUPGHOME)
c = cmd.run(
|
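
The distinction is easy to reproduce: Python's universal newlines normalise line endings to `\n` when a file is opened in text mode, so on Windows a pattern built from `os.linesep` (`\r\n`) never matches the stream. A small demonstration:

```python
import os

with open("out.txt", "w") as f:    # text mode: "\n" becomes os.linesep on disk
    f.write("line1\nline2\n")

with open("out.txt") as f:         # text mode: endings are normalised back
    assert f.read() == "line1\nline2\n"

with open("out.txt", "rb") as f:   # binary mode: raw bytes, os.linesep shows up
    raw = f.read()                 # b"line1\r\nline2\r\n" on Windows
print(os.linesep.encode(), raw)
```
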
Updated to use fstrings
Also made the same changes requested by flaming on the apprise pull | @@ -510,13 +510,11 @@ def changeparams():
app.logger.debug(f"main={config.MAINFEATURE}")
job.disctype = format(form.DISCTYPE.data)
db.session.commit()
- flash(
- 'Parameters changed. Rip Method={}, Main Feature={}, Minimum Length={}, '
- 'Maximum Length={}, Disctype={}'.format(
- form.RIPMETHOD.data, form.MAINFEATURE.data, form.MINLENGTH.data, form.MAXLENGTH.data,
- form.DISCTYPE.data))
db.session.refresh(job)
db.session.refresh(config)
+ flash(f'Parameters changed. Rip Method={config.RIPMETHOD}, Main Feature={config.MAINFEATURE},'
+ f'Minimum Length={config.MINLENGTH}, '
+ f'Maximum Length={config.MAXLENGTH}, Disctype={job.disctype}')
return redirect(url_for('home'))
return render_template('changeparams.html', title='Change Parameters', form=form)
@@ -532,7 +530,7 @@ def customtitle():
job.title = format(form.title.data)
job.year = format(form.year.data)
db.session.commit()
- flash('custom title changed. Title={}, Year={}.'.format(form.title.data, form.year.data))
+ flash(f'custom title changed. Title={form.title.data}, Year={form.year.data}.')
return redirect(url_for('home'))
return render_template('customTitle.html', title='Change Title', form=form)
|
Write regression test intentionally negated
That way it starts failing once #474 is fixed, notifying us of the fix.
Then the test can be reversed again. | @@ -34,7 +34,7 @@ test_storage_investment/test_storage_investment.py
SPDX-License-Identifier: GPL-3.0-or-later
"""
-from nose.tools import eq_
+from nose.tools import eq_, ok_
from oemof.tools import economics
@@ -202,5 +202,8 @@ def test_solph_transformer_attributes_before_dump_and_after_restore():
[x for x in dir(energysystem.groups['pp_gas']) if '__' not in x])
# Compare parameter before the dump and after the restore
- eq_(trsf_attr_before_dump, trsf_attr_after_restore)
+ # Once #474 is fixed, this test will start to fail, which will make you
+ # realize, that #474 is fixed. Just change the `!=` to `==` (or rewrite
+ # test using `eq_`) and boom, you have a nice regression test.
+ ok_(trsf_attr_before_dump != trsf_attr_after_restore)
|
Update rmsrat.txt
Seems FP. | @@ -91,13 +91,6 @@ sickly.jumpingcrab.com
202.58.105.80:5073
9.wqkwc.cn
-# Reference: https://www.virustotal.com/gui/file/ddf26651eb45bfdedae1f688e539683c3d54fdf808ed791f5dfe75386f948ca1/detection
-
-188.172.219.157:5938
-185.188.32.4:5938
-94.16.6.164:5938
-4.wqkwc.cn
-
# Generic trails
/inet_id_notify.php
|
Somehow recent file updates are really slow
Comment it out for now. | @@ -40,9 +40,10 @@ class RecentFilesMenu(Gio.Menu):
self._on_recent_manager_changed(recent_manager)
# TODO: should unregister if the window is closed.
- self._changed_id = recent_manager.connect(
- "changed", self._on_recent_manager_changed
- )
+ # TODO: GTK4 - Why is updating the recent files so slow?
+ # TODO: self._changed_id = recent_manager.connect(
+ # TODO: "changed", self._on_recent_manager_changed
+ # TODO: )
def _on_recent_manager_changed(self, recent_manager):
self.remove_all()
|
Remove Yummly
API has been discontinued :(
| [TheCocktailDB](https://www.thecocktaildb.com/api.php) | Cocktail Recipes | `apiKey` | Yes | Yes |
| [TheMealDB](https://www.themealdb.com/api.php) | Meal Recipes | `apiKey` | Yes | Yes |
| [What's on the menu?](http://nypl.github.io/menus-api/) | NYPL human-transcribed historical menu collection | `apiKey` | No | Unknown |
-| [Yummly](https://developer.yummly.com/) | Find food recipes | `apiKey` | Yes | Unknown |
| [Zomato](https://developers.zomato.com/api) | Discover restaurants | `apiKey` | Yes | Unknown |
|
fix: Update base timeline
prepare_timeline can have async calls | @@ -40,22 +40,36 @@ class BaseTimeline {
this.timeline_items_wrapper.empty();
this.timeline_items = [];
this.doc_info = this.frm && this.frm.get_docinfo() || {};
- this.prepare_timeline_contents();
-
+ let response = this.prepare_timeline_contents();
+ if (response instanceof Promise) {
+ response.then(() => {
this.timeline_items.sort((item1, item2) => new Date(item1.creation) - new Date(item2.creation));
this.timeline_items.forEach(this.add_timeline_item.bind(this));
+ });
+ } else {
+ this.timeline_items.sort((item1, item2) => new Date(item1.creation) - new Date(item2.creation));
+ this.timeline_items.forEach(this.add_timeline_item.bind(this));
+ }
}
prepare_timeline_contents() {
//
}
- add_timeline_item(item) {
+ add_timeline_item(item, append_at_the_end=false) {
let timeline_item = this.get_timeline_item(item);
+ if (append_at_the_end) {
+ this.timeline_items_wrapper.append(timeline_item);
+ } else {
this.timeline_items_wrapper.prepend(timeline_item);
+ }
return timeline_item;
}
+ add_timeline_items(items, append_at_the_end=false) {
+ items.forEach((item) => this.add_timeline_item(item, append_at_the_end));
+ }
+
get_timeline_item(item) {
// item can have content*, creation*,
// timeline_badge, icon, icon_size,
|
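
The same `instanceof Promise` dispatch translated into Python terms, purely as an analogy: call the hook, and await only when it actually handed back an awaitable.

```python
import asyncio
import inspect

def prepare_sync():
    return ["b", "a"]

async def prepare_async():
    return ["b", "a"]

async def refresh(prepare):
    items = prepare()
    if inspect.isawaitable(items):  # mirrors `response instanceof Promise`
        items = await items
    return sorted(items)

print(asyncio.run(refresh(prepare_sync)))   # ['a', 'b']
print(asyncio.run(refresh(prepare_async)))  # ['a', 'b']
```
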
Check a user's rights rather than group memberships
This is a continuation to | @@ -1918,8 +1918,8 @@ class BasePage(UnicodeMixin, ComparableMixin):
pywikibot.output('Deleting %s.' % (self.title(as_link=True)))
reason = pywikibot.input('Please enter a reason for the deletion:')
- # If user is a sysop, delete the page
- if self.site.username(sysop=True):
+ # If user has 'delete' right, delete the page
+ if 'delete' in self.site.userinfo['rights']:
answer = 'y'
if prompt and not hasattr(self.site, '_noDeletePrompt'):
answer = pywikibot.input_choice(
|
comment explaining the rationale of --source pkg
HG--
branch : issue-426 | @@ -811,6 +811,12 @@ class Coverage(object):
not os.path.exists(sys.modules[pkg].__file__)):
continue
pkg_file = source_for_file(sys.modules[pkg].__file__)
+ #
+ # Do not explore the souce tree of a module that is
+ # not a directory tree. For instance if
+ # sys.modules['args'].__file__ == /lib/python2.7/site-packages/args.pyc
+ # we do not want to explore all of /lib/python2.7/site-packages
+ #
if not pkg_file.endswith('__init__.py'):
continue
src_directories.append(self._canonical_dir(sys.modules[pkg]))
|
Langkit_Support.Text: provide several character constants
TN: | @@ -2,6 +2,7 @@ with Ada.Unchecked_Deallocation;
package Langkit_Support.Text is
+ subtype Character_Type is Wide_Wide_Character;
subtype Text_Type is Wide_Wide_String;
-- All our strings are encoded in UTF-32 (native endinannness). This type,
-- which is not a subtype of String, makes it obvious when conversions are
@@ -22,4 +23,15 @@ package Langkit_Support.Text is
procedure Free is new Ada.Unchecked_Deallocation (Text_Type, Text_Access);
+ package Chars is
+ NUL : constant Character_Type :=
+ Wide_Wide_Character'Val (Character'Pos (ASCII.NUL));
+ LF : constant Character_Type :=
+ Wide_Wide_Character'Val (Character'Pos (ASCII.LF));
+ HT : constant Character_Type :=
+ Wide_Wide_Character'Val (Character'Pos (ASCII.HT));
+ ESC : constant Character_Type :=
+ Wide_Wide_Character'Val (Character'Pos (ASCII.ESC));
+ end Chars;
+
end Langkit_Support.Text;
|
Fix is_early_compact_l2a
Reprocessed L2A products are not from an early compact version. They were processed with baseline 02.08. | @@ -300,7 +300,7 @@ class AwsService(ABC):
:rtype: bool
"""
return self.data_collection is DataCollection.SENTINEL2_L2A and self.safe_type is EsaSafeType.COMPACT_TYPE and \
- self.baseline <= '02.06'
+ '00.01' < self.baseline <= '02.06'
class AwsProduct(AwsService):
|
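
Because the baselines are zero-padded, fixed-width strings, plain lexicographic comparison orders them correctly, and a chained comparison expresses the whole window in one line. A quick check of the new predicate's behaviour (assuming that fixed-width format holds):

```python
def is_early_compact(baseline: str) -> bool:
    # "strictly newer than 00.01, no newer than 02.06"
    return "00.01" < baseline <= "02.06"

assert not is_early_compact("00.01")  # excluded by the new lower bound
assert is_early_compact("02.06")
assert not is_early_compact("02.08")  # newer baselines excluded as before
```
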
Add get_wait_seconds, process_timeouts args to PollPoller constructor signature.
Remove errant param documentation from Poller constructors. | @@ -848,16 +848,8 @@ class KQueuePoller(_PollerBase):
def __init__(self, get_wait_seconds, process_timeouts):
"""Create an instance of the KQueuePoller
-
- :param int fileno: The file descriptor to check events for
- :param method handler: What is called when an event happens
- :param int events: The events to look for
-
"""
- super(KQueuePoller, self).__init__(
- get_wait_seconds=get_wait_seconds,
- process_timeouts=process_timeouts
- )
+ super(KQueuePoller, self).__init__(get_wait_seconds, process_timeouts)
self._kqueue = None
@@ -984,19 +976,12 @@ class PollPoller(_PollerBase):
"""
POLL_TIMEOUT_MULT = 1000
- def __init__(self):
+ def __init__(self, get_wait_seconds, process_timeouts):
"""Create an instance of the KQueuePoller
- :param int fileno: The file descriptor to check events for
- :param method handler: What is called when an event happens
- :param int events: The events to look for
-
"""
self._poll = None
- super(PollPoller, self).__init__(
- get_wait_seconds=get_wait_seconds,
- process_timeouts=process_timeouts
- )
+ super(PollPoller, self).__init__(get_wait_seconds, process_timeouts)
@staticmethod
def _create_poller():
|
mailmap: Give priority to Scott Feeny's GitHub email.
Email fetched from the public commit at
graue/synth/commit/73bac0f593d75d8eb5fb23dd638a279417ccffd0.patch | @@ -20,6 +20,8 @@ Rishi Gupta <[email protected]> <[email protected]>
Rishi Gupta <[email protected]> <[email protected]>
Rishi Gupta <[email protected]> <[email protected]>
Reid Barton <[email protected]> <[email protected]>
+Scott Feeney <[email protected]> <[email protected]>
+Scott Feeney <[email protected]> <[email protected]>
Steve Howell <[email protected]> <[email protected]>
Steve Howell <[email protected]> <[email protected]>
Steve Howell <[email protected]> <[email protected]>
@@ -37,4 +39,3 @@ Vishnu KS <[email protected]> <[email protected]>
# are no multiple entries in /team page.
Allen Rabinovich <[email protected]> <[email protected]>
Jeff Arnold <[email protected]> <[email protected]>
-Scott Feeney <[email protected]> <[email protected]>
|
Fixes broken handling of cvxpy being absent for report generation.
This commit restores the desired behavior that diamond norm values
are displayed as blanks when cvxpy cannot be imported. This was
broken (so that an ImportError was raised) when reportables.py was
updated into cleaner, independent calculation functions. | @@ -905,9 +905,14 @@ def fro_diff(A, B, mxBasis): # assume vary gateset1, gateset2 fixed
def jt_diff(A, B, mxBasis): # assume vary gateset1, gateset2 fixed
return _tools.jtracedist(A, B, mxBasis)
+try:
+ import cvxpy as _cvxpy
@gates_quantity() # This function changes arguments to (gatesetA, gatesetB, gateLabel, confidenceRegionInfo)
def half_diamond_norm(A, B, mxBasis):
return 0.5 * _tools.diamonddist(A, B, mxBasis)
+except ImportError:
+ def half_diamond_norm(gatesetA, gatesetB, gatelabel, confidenceRegionInfo):
+ return ReportableQty(_np.nan) # report NAN for diamond norms
@gates_quantity() # This function changes arguments to (gatesetA, gatesetB, gateLabel, confidenceRegionInfo)
def unitarity_infidelity(A, B, mxBasis):
|
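
The hunk uses the standard optional-dependency pattern: define the real function when the import succeeds, and a same-signature fallback returning NaN when it doesn't, so report cells render blank instead of the whole build crashing. A self-contained sketch — the norm here is a stand-in, not the actual diamond-norm SDP:

```python
import numpy as np

try:
    import cvxpy  # optional: needed only for the SDP-based metric

    def half_diamond_norm(a, b):
        # Stand-in body; the real thing solves a semidefinite program.
        return 0.5 * float(np.linalg.norm(a - b, 1))
except ImportError:
    def half_diamond_norm(a, b):
        # Same signature, but NaN renders as a blank table cell.
        return float("nan")

print(half_diamond_norm(np.eye(2), np.zeros((2, 2))))
```
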
Typo in custom_program_name.rst
Just a small typo fix inside custom_program_name.rst
~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-Sometimes is useful to have a different firmware/program name in
+Sometimes it is useful to have a different firmware/program name in
:ref:`projectconf_pio_build_dir`. The following example uses
:ref:`scripting_example_custom_project_conf_options` and adds
a project version suffix to the firmware name.
|
Update kb_category.html
Standardises output between KB listing and KB items rating display (so that both are "n/10", where the listing was previously a float), and captures case where there may be no votes cast. | {% blocktrans with item.get_absolute_url as url %}View <a href='{{ url }}'>Answer <i class="fa fa-arrow-right"></i></a>{% endblocktrans %}
</p>
<div class="well well-sm">
- <p>{% trans 'Rating' %}: {{ item.score }}</p>
+ <p>{% trans 'Rating' %}:
+ {% if item.votes > 0 %}
+ {{ item.score|floatformat }}/10
+ {% else %}
+ {% trans 'no score yet' %}
+ {% endif %}
+ </p>
<p>{% trans 'Last Update' %}: {{ item.last_updated|naturaltime }}</p>
</div>
</div>
|
Add docs on recommended migration patterns
Originally drafted here: | @@ -91,5 +91,6 @@ to ensure we can rebuild the view without issue.
Migration Patterns and Best Practices
-------------------------------------
+- :ref:`migrations-in-practice`
- :ref:`auto-managed-migration-pattern`
- :ref:`couch-to-sql-model-migration`
|
Postprocess json for gatsby build to remove .html links
Summary: Title
Test Plan: `gatsby build && gatsby serve`
Reviewers: max | @@ -95,6 +95,17 @@ async function main() {
sh.ls("-R", `${SPHINX_BUILD_JSON}/**/*.json`).forEach(file => {
const filename = file.replace(SPHINX_BUILD_JSON, "");
const newFilepath = path.join(buildDir, filename);
+
+ // Postprocess file
+ const fileContent = fs.readFileSync(file, "utf8");
+ const result = fileContent.replace(
+ /href=\\"(?!http)[^>]*?html\\">/g,
+ function(match) {
+ return match.replace(".html", "");
+ }
+ );
+ fs.writeFileSync(file, result, "utf8");
+
fs.copySync(file, newFilepath);
console.log(ch.cyan(`Copying ${filename}`));
});
|
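
The same href rewrite in Python terms (the build script itself is JavaScript; this is only an analogy). The doubled backslashes exist because the links live inside JSON-encoded HTML, and the `(?!http)` lookahead leaves external links alone:

```python
import re

page = '<a href=\\"guides/intro.html\\">Intro</a> <a href=\\"http://e.xt/a.html\\">ext</a>'

result = re.sub(
    r'href=\\"(?!http)[^>]*?html\\">',
    lambda m: m.group(0).replace(".html", ""),  # drop .html from internal links
    page,
)
print(result)  # only guides/intro loses its .html suffix
```
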
tests: Remove module under search from expected results
This matches a tweak made in | @@ -148,7 +148,6 @@ class FindRelatedTest(testlib.TestCase):
import django
related = self.call('django')
self.assertEquals(related, [
- 'django',
'django.utils',
'django.utils.lru_cache',
'django.utils.version',
@@ -164,7 +163,6 @@ class FindRelatedTest(testlib.TestCase):
'django.core',
'django.core.exceptions',
'django.core.signals',
- 'django.db',
'django.db.utils',
'django.dispatch',
'django.dispatch.dispatcher',
@@ -220,7 +218,6 @@ class FindRelatedTest(testlib.TestCase):
'django.db',
'django.db.backends',
'django.db.backends.utils',
- 'django.db.models',
'django.db.models.aggregates',
'django.db.models.base',
'django.db.models.constants',
|
Update the landing page text.
The new text was written by Tizzysaurus in his website cleanup
project, and lifted from his google doc. | <div class="column is-half">
<p>
We're a large community focused around the Python programming language.
- We believe anyone can learn programming, and are very dedicated to helping
- novice developers take their first steps into the world of code. We also
+ We believe anyone can learn to code, and are very dedicated to helping
+ novice developers take their first steps into the world of programming. We also
attract a lot of expert developers who are seeking friendships, collaborators,
- or looking to hone their craft by teaching and getting involved in the community.
+ or to hone their craft by teaching and getting involved with the community.
<br/><br/>
- We organise regular community events like code jams, open source hackathons,
- seasonal events and community challenges. Through our sponsorships and with
- help from donations, we are even able to award prizes to the winners of our events.
+ We organise regular community events such as code jams, open-source hackathons,
+ seasonal events and community challenges. Through our sponsorships, and with
+ the help from donations, many of our events even have prizes to win!
<br/><br/>
You can find help with most Python-related problems in one of our help channels.
- Our staff of nearly 50 dedicated expert Helpers are available around the clock
+ Our staff of over 50 dedicated expert Helpers, are available around the clock
in every timezone. Whether you're looking to learn the language or working on a
complex project, we've got someone who can help you if you get stuck.
</p>
|
test: Use generator for IAM template names
See also: | @@ -5,18 +5,19 @@ from foremast.iam.construct_policy import render_policy_template
from foremast.utils.templates import LOCAL_TEMPLATES
-def test_all_iam_templates():
- """Verify all IAM templates render as proper JSON."""
+def iam_templates():
+ """Generate list of IAM templates."""
jinjaenv = jinja2.Environment(loader=jinja2.FileSystemLoader([LOCAL_TEMPLATES]))
- iam_templates = jinjaenv.list_templates(filter_func=lambda x: all([
+ iam_template_names = jinjaenv.list_templates(filter_func=lambda x: all([
x.startswith('infrastructure/iam/'),
'trust' not in x,
'wrapper' not in x, ]))
- for template in iam_templates:
- *_, service_json = template.split('/')
- service, *_ = service_json.split('.')
+ for iam_template_name in iam_template_names:
+ yield iam_template_name
+
+
items = ['resource1', 'resource2']
|
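
Turning the template discovery into a generator lets each template become its own test case instead of one monolithic loop. One common way to consume such a generator is `pytest.mark.parametrize`; the template names below are hypothetical stand-ins for the real discovery:

```python
import pytest

def iam_templates():
    # Stand-in for the jinjaenv.list_templates(...) discovery above.
    yield "infrastructure/iam/ec2.json.j2"
    yield "infrastructure/iam/s3.json.j2"

@pytest.mark.parametrize("template_name", iam_templates())
def test_renders_as_json(template_name):
    # Each template gets its own pass/fail line in the report.
    assert template_name.startswith("infrastructure/iam/")
```
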
Don't arbitrarily strip an argument from make_pass_decorator
This fixes | @@ -61,7 +61,7 @@ def make_pass_decorator(object_type, ensure=False):
raise RuntimeError('Managed to invoke callback without a '
'context object of type %r existing'
% object_type.__name__)
- return ctx.invoke(f, obj, *args[1:], **kwargs)
+ return ctx.invoke(f, obj, *args, **kwargs)
return update_wrapper(new_func, f)
return decorator
|
v3.13.0.0
Changelog: | @@ -350,9 +350,9 @@ def update_db_v_3_12_1(**kwargs):
except sqltool.Error as e:
if kwargs.get('silent') != 1:
if e.args[0] == 'duplicate column name: param' or e == "1060 (42S21): Duplicate column name 'param' ":
- print('DB was update to 3.12.1.0')
+ print('Updating... go to version 3.12.1.0')
else:
- print("DB was update to 3.12.1.0")
+ print("Updating... go to version 3.12.1.0")
return False
else:
return True
@@ -384,7 +384,7 @@ def update_db_v_3_13(**kwargs):
def update_ver(**kwargs):
con, cur = get_cur()
- sql = """update version set version = '3.12.2.2'; """
+ sql = """update version set version = '3.13.0.0'; """
try:
cur.execute(sql)
con.commit()
|
Fix migrate action with migration_type 'cold'
Migration action with migration_type 'cold' does not work.
This patch fixes nova_helper to follow the Pike release of python-novaclient.
Closes-Bug: | @@ -200,11 +200,7 @@ class NovaHelper(object):
new_image_name = getattr(image, "name")
instance_name = getattr(instance, "name")
- flavordict = getattr(instance, "flavor")
- # a_dict = dict([flavorstr.strip('{}').split(":"),])
- flavor_id = flavordict["id"]
- flavor = self.nova.flavors.get(flavor_id)
- flavor_name = getattr(flavor, "name")
+ flavor_name = instance.flavor.get('original_name')
keypair_name = getattr(instance, "key_name")
addresses = getattr(instance, "addresses")
@@ -771,10 +767,9 @@ class NovaHelper(object):
# Make sure all security groups exist
for sec_group_name in sec_group_list:
- try:
- self.nova.security_groups.find(name=sec_group_name)
+ group_id = self.get_security_group_id_from_name(sec_group_name)
- except nvexceptions.NotFound:
+ if not group_id:
LOG.debug("Security group '%s' not found " % sec_group_name)
return
@@ -821,6 +816,14 @@ class NovaHelper(object):
return instance
+ def get_security_group_id_from_name(self, group_name="default"):
+ """This method returns the security group of the provided group name"""
+ security_groups = self.neutron.list_security_groups(name=group_name)
+
+ security_group_id = security_groups['security_groups'][0]['id']
+
+ return security_group_id
+
def get_network_id_from_name(self, net_name="private"):
"""This method returns the unique id of the provided network name"""
networks = self.neutron.list_networks(name=net_name)
|
Fix Discovery Problem options in Report object detail
HG--
branch : feature/microservices | @@ -582,7 +582,7 @@ class ReportObjectDetailApplication(ExtApplication):
for mo in moss:
if mo not in mos_id:
continue
- dp = discovery_problem[mo]
+ dp = discovery_problem.get(mo)
r += [translate_row(row([
mo,
moss[0],
|
[bugfix] use assertRaisesRegex instead of assertRaisesRegexp
assertRaisesRegexp was renamed to assertRaisesRegex with Python 3.2.
Use the context manager, which was introduced with Python 3, instead of
partial parameters
self.assertIsNone(site.login())
if site.is_oauth_token_available():
- self.assertRaisesRegexp(api.APIError, 'cannotlogout.*OAuth',
- site.logout)
+ with self.assertRaisesRegex(api.APIError, 'cannotlogout.*OAuth'):
+ site.logout()
self.assertTrue(site.logged_in())
self.assertIn(site._loginstatus, (loginstatus.IN_PROGRESS,
loginstatus.AS_USER))
|