Human Generated Data

Title

Untitled (unidentified young man, seated leaning against wall working loom or cloth spinning device)

Date

1860-1899

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Purchase through the generosity of Melvin R. Seiden, P1982.329.33

Machine Generated Data

Tags (label and confidence score, %)

Amazon
created on 2022-02-25

Person 98.6
Human 98.6
Spoke 93.4
Machine 93.4
Wheel 82.6
Wood 76.4
Plywood 68.5
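
The Amazon tags above are label/confidence pairs of the kind returned by Amazon Rekognition's DetectLabels operation. A minimal sketch of how such tags can be produced with boto3, assuming configured AWS credentials, an arbitrarily chosen region, and a hypothetical local copy of the photograph named photo.jpg:

    import boto3

    # Rekognition client; the region is an assumption.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Read the image bytes from a local file (hypothetical filename).
    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectLabels returns labels with confidence scores on a 0-100 scale,
    # comparable to the "Person 98.6, Spoke 93.4, ..." values listed above.
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=25,
        MinConfidence=60.0,
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")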

Clarifai
created on 2023-10-28

people 99.6
portrait 99.5
art 98.4
adult 97.4
vintage 96.7
old 95.7
man 95.7
woman 95.4
one 94.3
painting 94
child 93.9
wear 93.7
documentary 92.7
two 91.6
retro 90.9
sepia 90.8
picture frame 89.9
collage 88.8
window 87.7
girl 86.7

Imagga
created on 2022-02-25

shovel 69.8
hand tool 50.6
tool 40.3
man 18.1
old 15.3
building 15.1
black 13.8
male 13.5
backboard 12.5
equipment 12.4
water 12
architecture 11.7
silhouette 10.8
people 10.6
wall 10.3
grunge 10.2
art 9.8
outdoors 9.7
person 9.5
light 9.3
street 9.2
vintage 9.1
musical instrument 9.1
metal 8.8
working 8.8
sepia 8.7
men 8.6
stone 8.4
beach 8.4
outdoor 8.4
house 8.3
dirty 8.1
sunset 8.1
transportation 8.1
walking 7.6
swab 7.5
travel 7
percussion instrument 7

Google
created on 2022-02-25

Microsoft
created on 2022-02-25

text 99.5
clothing 97
person 96
smile 70.9
old 66.6
posing 65.1

Color Analysis

Face analysis (providers: Amazon, Microsoft, Google)

AWS Rekognition

Age 22-30
Gender Male, 99.8%
Calm 89.6%
Confused 4.5%
Surprised 1.7%
Angry 1.4%
Happy 1%
Sad 0.8%
Fear 0.5%
Disgusted 0.4%
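
The age range, gender, and emotion scores above match the shape of output from Amazon Rekognition's DetectFaces operation when all facial attributes are requested. A minimal sketch under the same assumptions as the DetectLabels example (configured AWS credentials, hypothetical local file photo.jpg):

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    # Attributes=["ALL"] adds age range, gender, and emotion estimates
    # to the default bounding-box response.
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")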

Feature analysis

Amazon

Person
Person 98.6%

Categories