Rename project protention -> hexviz
Rename project protention -> hexviz
GPT-4 says it is best:
HexViz: In the Discworld series, Hex is a magical computer-like machine
created by the wizards of the Unseen University. HexViz combines the
name Hex with "viz," short for visualization. This name suggests that
the tool has a magical ability to decipher and visualize the
attention patterns within protein structures, much like Hex's
ability to solve complex problems in the Discworld universe.
README.md
CHANGED

````diff
@@ -1,4 +1,4 @@
-# protention
+# hexviz
 Visualize attention pattern on 3D protein structures
 
 ## Install and run
@@ -6,5 +6,5 @@ Visualize attention pattern on 3D protein structures
 ```shell
 poetry install
 
-poetry run streamlit run protention/streamlit/Attention_On_Structure.py
+poetry run streamlit run hexviz/streamlit/Attention_On_Structure.py
 ```
````
{protention → hexviz}/attention.py
RENAMED
File without changes
{protention → hexviz}/streamlit/Attention_On_Structure.py
RENAMED

```diff
@@ -3,7 +3,7 @@ import stmol
 import streamlit as st
 from stmol import showmol
 
-from protention.attention import Model, ModelType, get_attention_pairs
+from hexviz.attention import Model, ModelType, get_attention_pairs
 
 st.sidebar.title("pLM Attention Visualization")
 
```
{protention → hexviz}/streamlit/__init__.py
RENAMED
File without changes
pyproject.toml
CHANGED

```diff
@@ -1,7 +1,7 @@
 [tool.poetry]
-name = "protention"
+name = "hexviz"
 version = "0.1.0"
-description = "Visualize and analyze attention patterns for protein language
+description = "Visualize and analyze attention patterns for protein language models on structures"
 authors = ["Aksel Lenes <[email protected]>"]
 
 [tool.poetry.dependencies]
```

(The old `description` line is truncated in the source; it is left as captured.)
tests/test_attention.py
CHANGED

```diff
@@ -2,9 +2,9 @@ import torch
 from Bio.PDB.Structure import Structure
 from transformers import T5EncoderModel, T5Tokenizer
 
-from protention.attention import (ModelType, get_attention, get_protT5,
-                                  get_sequences, get_structure,
-                                  unidirectional_sum_filtered)
+from hexviz.attention import (ModelType, get_attention, get_protT5,
+                              get_sequences, get_structure,
+                              unidirectional_sum_filtered)
 
 
 def test_get_structure():
```
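The textual half of a rename like this (the `git mv` of the package directory aside) can be scripted rather than edited by hand. A minimal sketch, where `rename_package` is a hypothetical helper (not part of hexviz) that does a plain textual replace across `.py`, `.toml`, and `.md` files — so it assumes the old name never appears anywhere it should be kept:

```python
from pathlib import Path


def rename_package(root: Path, old: str, new: str) -> int:
    """Replace occurrences of `old` with `new` in source, config, and docs.

    Returns the number of files rewritten. This is a plain textual replace,
    so it assumes `old` only ever appears as the package name.
    """
    changed = 0
    for path in root.rglob("*"):
        if path.suffix not in {".py", ".toml", ".md"}:
            continue  # skip directories and unrelated file types
        text = path.read_text()
        if old in text:
            path.write_text(text.replace(old, new))
            changed += 1
    return changed


# Example: rewrite an import in a scratch tree.
import tempfile

with tempfile.TemporaryDirectory() as tmp:
    tests_dir = Path(tmp) / "tests"
    tests_dir.mkdir()
    test_file = tests_dir / "test_attention.py"
    test_file.write_text("from protention.attention import ModelType\n")
    n = rename_package(Path(tmp), "protention", "hexviz")
    print(n)                            # 1
    print(test_file.read_text().strip())  # from hexviz.attention import ModelType
```

Running the same replace over the repository root covers the README command, the `pyproject.toml` name, and all import statements in one pass; the directory rename itself still needs `git mv protention hexviz` so history is preserved.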
|