Human Generated Data

Title

Untitled (girl feeding seagulls at beach)

Date

1947

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8670

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Apparel 99.8
Shorts 99.8
Clothing 99.8
Person 99.4
Human 99.4
Person 96
Person 94
Bird 90.9
Animal 90.9
Bird 86.2
Bird 84.4
Person 81.8
Person 79
Bird 78.9
Crowd 71.7
Person 69.7
Female 68.7
Bird 67.3
Leisure Activities 66.9
Person 64
Person 63.4
Building 61
Stage 59.3
Person 58.3
Performer 57.7
Screen 57.4
Display 57.4
Monitor 57.4
Electronics 57.4
Woman 56.8

Imagga
created on 2022-01-09

silhouette 21.5
black 18
structure 15.4
sky 15.3
night 15.1
building 15
light 14.9
dark 14.2
man 14.1
architecture 14.1
blackboard 13.6
billboard 13.4
city 13.3
skyline 13.3
electric chair 13.1
grunge 12.8
negative 12.6
symbol 12.1
sunset 11.7
history 11.6
travel 11.3
old 11.1
television 11.1
film 11
signboard 10.9
device 10.9
instrument of execution 10.8
person 10.2
water 10
dirty 9.9
religion 9.9
landscape 9.7
urban 9.6
design 9.6
smoke 9.3
art 9.3
instrument 9.2
landmark 9
sign 9
factory 8.7
equipment 8.6
cityscape 8.5
outdoor 8.4
evening 8.4
business 7.9
player 7.9
destruction 7.8
rock 7.8
people 7.8
explosion 7.7
hand 7.6
power 7.6
vintage 7.4
famous 7.4
lights 7.4
tower 7.3
protection 7.3
industrial 7.3
horizon 7.2
shadow 7.2
moon 7.1
male 7.1

Microsoft
created on 2022-01-09

text 99.8
concert 90.7
person 78.2
black and white 77.8
display 35.3
picture frame 9.7

Face analysis

AWS Rekognition

Age 20-28
Gender Male, 51.1%
Happy 97%
Calm 1.7%
Sad 0.5%
Angry 0.3%
Confused 0.2%
Surprised 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 24-34
Gender Male, 96.6%
Surprised 49.3%
Calm 17.8%
Confused 15.2%
Fear 8.6%
Happy 3.2%
Sad 2.7%
Disgusted 1.9%
Angry 1.3%

AWS Rekognition

Age 6-14
Gender Female, 69.8%
Fear 59.3%
Calm 17.6%
Happy 7%
Sad 6.5%
Surprised 5.7%
Confused 1.8%
Angry 1.1%
Disgusted 1%

AWS Rekognition

Age 18-26
Gender Male, 91.9%
Calm 68.5%
Fear 8.9%
Happy 6.2%
Sad 5.9%
Confused 3.4%
Disgusted 3%
Angry 2.6%
Surprised 1.5%

AWS Rekognition

Age 9-17
Gender Male, 80.6%
Calm 69.4%
Sad 7.7%
Happy 5.4%
Fear 5.2%
Confused 4.1%
Angry 3.5%
Surprised 3.2%
Disgusted 1.4%

AWS Rekognition

Age 16-24
Gender Female, 86.5%
Calm 42.8%
Sad 32.2%
Confused 12.3%
Disgusted 4.5%
Fear 4.4%
Happy 1.4%
Angry 1.3%
Surprised 1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Feature analysis

Amazon

Person 99.4%
Bird 90.9%

Captions

Microsoft

a person standing in front of a window 61.5%
a group of people standing in front of a window 51.1%
a person standing in front of a television 51%

Text analysis

Amazon

22872.
IS
KODAKSLA

Google

22872.
22872.