Human Generated Data

Title

Untitled (two men inspecting locks at York Safe & Lock Co.)

Date

1941

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8618

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.8
Human 99.8
Person 99.8
Clothing 97.1
Apparel 97.1
Person 91
Chess 90.9
Game 90.9
Person 90
Person 74.4
Home Decor 73.4
Person 72.2
People 65.2
Overcoat 62.7
Coat 62.7
Suit 60.9
Screen 60
Electronics 60
Shorts 59.6
Face 59.4
Poster 58.9
Advertisement 58.9
Monitor 58.1
Display 58.1
Furniture 57.9

Clarifai
created on 2023-10-25

people 99.9
group 99.4
group together 98.4
many 97.4
adult 97.1
two 96.5
man 96.4
one 95.1
woman 93.3
crowd 91.4
monochrome 90.8
wear 89.7
street 89.2
grow 89.2
administration 88.4
three 88.3
four 85.1
market 84.6
food 84.6
merchant 84.3

Imagga
created on 2022-01-09

fence 23.8
picket fence 17.7
outdoors 16.4
landscape 15.6
building 15.4
structure 14.7
barrier 13.6
travel 13.4
outdoor 13
snow 12.8
wall 12.3
sidewalk 11.8
architecture 11.7
summer 11.6
park 11.5
device 11
wood 10.8
water 10.7
old 10.4
winter 10.2
sky 10.2
beach 10.1
tree 10.1
city 10
texture 9.7
umbrella 9.3
obstruction 9.3
house 9.2
black 9.1
vacation 9
forest 8.7
day 8.6
street 8.3
tourism 8.2
lady 8.1
metal 8
sand 7.9
season 7.8
ancient 7.8
construction 7.7
industry 7.7
line 7.7
leisure 7.5
ocean 7.5
cleaning implement 7.2
home 7.2
portrait 7.1
worker 7.1
steel 7.1
sea 7
scenic 7

Google
created on 2022-01-09

Black 89.7
Black-and-white 85.3
Gesture 85.3
Style 83.8
Line 81.9
Adaptation 79.4
Terrestrial plant 77.9
Font 77.5
Plant 77.4
Monochrome 76.4
Monochrome photography 75.4
Snapshot 74.3
Hat 68.8
Rectangle 66.1
Stock photography 64.4
Art 63.3
History 57.7
Room 57
Vintage clothing 56.7
Street 55.8

Microsoft
created on 2022-01-09

text 97.8
black and white 97.2
outdoor 96.8
clothing 86.2
street 85.4
monochrome 82.3
man 80.8
person 69.6

Color Analysis

Face analysis

AWS Rekognition

Age 45-51
Gender Male, 95.8%
Surprised 65.6%
Calm 23.5%
Sad 5%
Angry 1.6%
Fear 1.2%
Disgusted 1.2%
Confused 1.1%
Happy 0.9%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Chess 90.9%

Categories

Text analysis

Amazon

A70A
17985.
MJIR YE3 A70A
MJIR
YE3
19955.

Google

935.
935.