Human Generated Data

Title

Untitled (girl feeding young boy with a spoon at small table)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4420

Copyright

© Estate of Joseph Janney Steinmetz
Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 96
Human 96
Person 89.9
Room 82.5
Indoors 82.5
Furniture 72.1
Female 72.1
Text 66.9
Girl 66.3
Chair 63
People 61.9
Kid 61
Child 61
Restaurant 57.9

Imagga
created on 2022-01-23

blackboard 48.6
classroom 30.7
person 30.1
education 29.4
student 27.5
people 26.8
man 25.6
male 25.5
school 24.7
teacher 23.5
business 21.2
class 21.2
adult 20.1
newspaper 19.3
businessman 18.5
chalkboard 17.6
looking 16.8
hand 16.7
product 16.4
college 16.1
study 15.8
portrait 15.5
board 15.4
creation 14.5
happy 14.4
human 14.2
work 14.1
teaching 13.6
senior 13.1
success 12.9
finance 12.7
technology 12.6
chart 12.4
room 12.3
writing 12.3
group 12.1
computer 12
desk 11.9
professional 11.5
job 11.5
office 11.4
learn 11.3
drawing 11.3
men 11.2
billboard 10.9
smile 10.7
diagram 10.5
exam 10.5
old 10.4
money 10.2
indoor 10
formula 9.8
modern 9.8
math 9.8
financial 9.8
lesson 9.7
teach 9.7
to 9.7
knowledge 9.6
university 9.6
standing 9.6
plan 9.4
specialist 9.3
letter 9.3
nurse 9.2
laptop 9.1
sign 9
team 9
one 9
science 8.9
signboard 8.8
mathematics 8.8
book 8.7
smiling 8.7
card 8.5
smart 8.5
casual 8.5
world 8.3
occupation 8.2
lady 8.1
symbol 8.1
market 8
daily 7.9
postmark 7.9
design 7.9
boy 7.8
educate 7.8
black 7.8
child 7.8
stamp 7.7
sitting 7.7
studying 7.7
mail 7.7
thinking 7.6
holding 7.4
dollar 7.4
case 7.4
glasses 7.4
aged 7.2
idea 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.9
person 89.5
clothing 81.9
human face 76.7
black and white 65.6

Face analysis

AWS Rekognition

Age 22-30
Gender Female, 79%
Calm 45.3%
Surprised 30%
Happy 11.1%
Angry 5%
Disgusted 3%
Fear 2.1%
Sad 2%
Confused 1.4%

AWS Rekognition

Age 38-46
Gender Female, 98.8%
Calm 98.5%
Sad 0.7%
Happy 0.3%
Confused 0.2%
Disgusted 0.1%
Surprised 0.1%
Angry 0.1%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 96%
Chair 63%

Captions

Microsoft

text 72.1%

Text analysis

Amazon

SOUP
BLE SOUP
BLE
AND
GREENS
AND PINEAPPLE
LIVER
CR
PINEAPPLE
ES
EAS
10
AND LIVER
10 30
30
APPLESAUCE
CH
D GREENS
17275A.
15 APPLESAUCE
6
15
D
C
BC
VI33A2
AOOM

Google

A.
7215A. EAS ER GREENS BLE SOUP D APPLES AUÇE CR AND PINEAPPLE ES IND LIVER 7275 A. 10 30
GREENS
30
EAS
ER
ES
7215A.
SOUP
AUÇE
PINEAPPLE
LIVER
BLE
D
APPLES
CR
AND
IND
7275
10