Human Generated Data

Title

Untitled (two young costumed girls walking into crowded auditorium)

Date

1947

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5506

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 99.3
Person 99.3
Person 99.3
Person 99.1
Person 99
Person 98.9
Person 98.4
Person 97.2
Person 96.1
Person 95.5
Flooring 95.3
Person 93.5
Person 92.8
Person 89.5
Person 87.6
Floor 86.8
Military 85.9
Military Uniform 83.9
Person 79
People 76.1
Furniture 74.5
Person 71
Indoors 70.9
Room 70.9
Person 69.4
Army 66.3
Armored 66.3
Interior Design 61.9

Imagga
created on 2022-01-23

drawing 30.9
sketch 29.7
snow 21.6
grunge 21.3
people 19.5
silhouette 18.2
representation 17.6
man 16.8
business 16.4
urban 15.7
men 15.5
city 15
person 14
scene 13.8
winter 13.6
dirty 13.6
vintage 13.2
retro 12.3
art 12.3
male 12.1
old 11.8
design 11.8
black 10.8
decoration 10.3
adult 10.1
active 9.9
activity 9.9
outdoors 9.7
women 9.5
motion 9.4
house 9.2
human 9
life 9
weather 8.9
style 8.9
crowd 8.6
season 8.6
window 8.6
construction 8.6
poster 8.5
sport 8.5
finance 8.4
outdoor 8.4
antique 8.3
speed 8.2
fun 8.2
pattern 8.2
paint 8.2
aged 8.1
ice 8.1
suit 8.1
transportation 8.1
group 8.1
office 8
graphic 8
lifestyle 7.9
businessman 7.9
reflection 7.9
facility 7.8
architecture 7.8
space 7.8
wall 7.7
texture 7.6
walking 7.6
power 7.6
sign 7.5
landscape 7.4
park 7.4
transport 7.3
building 7.2
success 7.2
working 7.1
work 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 96.2
clothing 95.7
person 94.7
footwear 90.9
room 79.6
gallery 77.9
white 63.4
drawing 54.2
child 51
old 48.3
posing 39.2

Face analysis

AWS Rekognition

Age 34-42
Gender Female, 87.2%
Calm 52.4%
Sad 32.5%
Happy 6.4%
Confused 4.8%
Angry 1.5%
Disgusted 1.1%
Surprised 0.8%
Fear 0.6%

AWS Rekognition

Age 18-26
Gender Male, 91.4%
Calm 86.4%
Sad 9.1%
Confused 2.1%
Angry 1.2%
Disgusted 0.3%
Fear 0.3%
Surprised 0.3%
Happy 0.2%

AWS Rekognition

Age 18-26
Gender Female, 99.6%
Calm 55.5%
Sad 30.4%
Happy 6.3%
Confused 5.7%
Fear 0.8%
Angry 0.7%
Surprised 0.3%
Disgusted 0.3%

AWS Rekognition

Age 41-49
Gender Female, 98.1%
Calm 59.3%
Sad 25.4%
Happy 8.6%
Confused 3.7%
Disgusted 0.9%
Angry 0.7%
Fear 0.7%
Surprised 0.6%

AWS Rekognition

Age 24-34
Gender Female, 84.5%
Calm 49.1%
Sad 34.4%
Confused 13%
Happy 1.7%
Angry 0.9%
Disgusted 0.4%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 24-34
Gender Male, 76.8%
Sad 86.4%
Calm 7%
Confused 2.2%
Fear 1.6%
Disgusted 0.8%
Happy 0.8%
Angry 0.8%
Surprised 0.4%

AWS Rekognition

Age 18-24
Gender Female, 80%
Calm 85.2%
Sad 10.7%
Confused 2.6%
Happy 0.6%
Angry 0.3%
Surprised 0.3%
Disgusted 0.2%
Fear 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%

Captions

Microsoft

a group of people standing in a room 80.2%
a group of people standing in front of a window 59.7%
a group of people standing in front of a building 59.6%

Text analysis

Amazon

22650
SMOKING
NO SMOKING
NO
SO SMOKING
-
6811 -
6811
YORK
EUPO

Google

SMIKING
SMOKING
NO
NO SMOKING NO SMOKING NO SMIKING 22650
22650