Human Generated Data

Title

Untitled (two women watching boy put on prosthetic leg)

Date

1938

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8246

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Clothing 100
Apparel 100
Person 99
Human 99
Person 98.4
Furniture 97.2
Chair 97.1
Person 96.2
Shoe 94.5
Footwear 94.5
Shoe 87.3
Person 83.5
Person 82.2
Shorts 74.3
Coat 71.6
Hat 70.2
Photography 67.2
Portrait 67.2
Face 67.2
Photo 67.2
People 66.9
Suit 66.6
Overcoat 66.6
Bonnet 63.1
Couch 59.5
Female 55.8
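
The label/score pairs above are typical confidence-scored tagger output, with scores given as percentages. A minimal Python sketch (tag names and scores transcribed from the Amazon list above; the threshold of 90 is an arbitrary illustrative choice) shows how such tags might be filtered before use:

```python
# Confidence-scored tags as (label, score) pairs, transcribed from the
# Amazon list above. Scores are percentages.
tags = [
    ("Clothing", 100.0), ("Person", 99.0), ("Chair", 97.1),
    ("Shoe", 94.5), ("Shorts", 74.3), ("Bonnet", 63.1), ("Female", 55.8),
]

def confident_tags(tags, threshold=90.0):
    """Keep only labels whose confidence meets the threshold."""
    return [label for label, score in tags if score >= threshold]

print(confident_tags(tags))  # ['Clothing', 'Person', 'Chair', 'Shoe']
```

Lower-scored tags ("Shorts" at 74.3, "Female" at 55.8) are dropped at this threshold, which is why such lists are usually read with the scores in mind rather than as flat keyword sets.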

Clarifai
created on 2023-10-25

people 99.8
adult 98.5
group together 97.1
man 96.4
group 96
wear 94.7
woman 94.3
child 93.2
two 93.2
veil 91.9
chair 90.8
monochrome 90.8
three 90.2
uniform 88.1
four 88.1
sit 88.1
medical practitioner 87.6
furniture 85.9
five 85
several 84.9

Imagga
created on 2022-01-08

trombone 53.3
brass 51.6
wind instrument 40.6
man 32.2
musical instrument 31.3
sword 26.1
person 24.5
people 24
male 22.7
weapon 21.7
bass 17.9
adult 17.7
mask 14.8
black 14.4
sport 13.3
player 13.3
play 12.1
men 12
helmet 11.4
human 11.2
silhouette 10.8
professional 10.6
work 10.2
dark 10
ballplayer 10
hand 9.9
portrait 9.7
athlete 9.5
music 9.2
outdoor 9.2
protection 9.1
suit 9
team 9
working 8.8
lifestyle 8.7
war 8.7
device 8.6
summer 8.4
holding 8.2
playing 8.2
style 8.2
musician 8
body 8
businessman 7.9
crutch 7.9
business 7.9
art 7.8
youth 7.7
studio 7.6
relax 7.6
happy 7.5
technology 7.4
freedom 7.3
equipment 7.2
smile 7.1
worker 7.1
medical 7.1
happiness 7

Google
created on 2022-01-08

(no tags returned)

Microsoft
created on 2022-01-08

text 98.3
person 90.1
clothing 73.4
black and white 72.1

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 90.8%
Calm 89.5%
Surprised 9.6%
Sad 0.3%
Happy 0.2%
Disgusted 0.1%
Fear 0.1%
Angry 0.1%
Confused 0.1%

AWS Rekognition

Age 29-39
Gender Female, 83.9%
Calm 90.3%
Happy 4.9%
Surprised 2.7%
Sad 0.9%
Fear 0.4%
Disgusted 0.3%
Angry 0.2%
Confused 0.2%

AWS Rekognition

Age 16-24
Gender Female, 94.4%
Calm 98.5%
Surprised 0.8%
Sad 0.6%
Disgusted 0%
Angry 0%
Happy 0%
Confused 0%
Fear 0%
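
Each Rekognition face record above reports a full emotion distribution rather than a single label. A short sketch (scores copied from the first face block above) shows how the dominant emotion might be read off such a record:

```python
# Emotion scores (percent) from the first AWS Rekognition face above.
emotions = {
    "Calm": 89.5, "Surprised": 9.6, "Sad": 0.3, "Happy": 0.2,
    "Disgusted": 0.1, "Fear": 0.1, "Angry": 0.1, "Confused": 0.1,
}

# The dominant emotion is simply the highest-scoring entry.
dominant = max(emotions, key=emotions.get)
print(dominant)  # Calm
```

Reading the full distribution matters when the top score is weak; here "Calm" at 89.5% is clearly dominant, but records with flatter distributions are better treated as ambiguous.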

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%
Chair 97.1%
Shoe 94.5%

Text analysis

Amazon

7659
MJI7
7659.
MJI7 YE3A А70A
YE3A
А70A

Google

7659
A
7659.
7659 MJ13 YT33A2 A 7659. 7659.
MJ13
YT33A2