Human Generated Data

Title

Untitled (two children in high chairs, one feeding the other)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16305

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Restaurant 98
Chair 97.5
Furniture 97.5
Person 97.5
Human 97.5
Person 90.3
Table 86
Cafeteria 85.7
Food 85.4
Meal 85.4
Dining Table 82.7
Sitting 75.2
Cafe 70.3
Indoors 65.9
Room 64.8
Dish 61.3
Flooring 58.6
Tabletop 56.7
Dining Room 56.7

Imagga
created on 2022-02-11

classroom 68.8
room 65
man 33.6
chair 28
brass 26.3
male 25.6
person 24.1
people 22.3
wind instrument 21.5
adult 21.3
blackboard 20.9
musical instrument 19
sitting 18.9
lifestyle 18.8
cornet 18.5
indoors 17.6
home 17.5
table 16.6
women 16.6
couple 16.5
men 16.3
smiling 15.9
casual 14.4
senior 14
teacher 13.8
happy 13.1
portrait 12.9
group 12.9
interior 12.4
education 12.1
school 12
music 11.9
happiness 11.7
holding 11.5
oboe 11.5
black 11.4
cheerful 11.4
student 11
seat 10.7
handsome 10.7
class 10.6
instrument 10.5
two 10.2
indoor 10
dress 9.9
device 9.8
together 9.6
play 9.5
furniture 9.4
mature 9.3
smile 9.3
equipment 9.1
modern 9.1
business 9.1
fun 9
family 8.9
job 8.8
teaching 8.8
child 8.7
love 8.7
musical 8.6
elderly 8.6
youth 8.5
enjoyment 8.4
old 8.4
hand 8.3
board 8.1
musician 8
businessman 7.9
boy 7.8
hands 7.8
son 7.8
concert 7.8
couch 7.7
desk 7.7
attractive 7.7
sit 7.6
communication 7.5
togetherness 7.5
training 7.4
phone 7.4
office 7.4
sexy 7.2
computer 7.2

Google
created on 2022-02-11

Microsoft
created on 2022-02-11

text 87
furniture 86.4
chair 72.1
person 70.8
table 65.6

Face analysis

AWS Rekognition

Age 6-16
Gender Female, 97.8%
Calm 87.8%
Happy 6.9%
Fear 2%
Surprised 1.1%
Sad 0.8%
Angry 0.8%
Disgusted 0.4%
Confused 0.3%

AWS Rekognition

Age 45-51
Gender Male, 87.5%
Calm 71.4%
Sad 26.2%
Confused 0.7%
Angry 0.7%
Surprised 0.5%
Disgusted 0.2%
Fear 0.1%
Happy 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.5%

Captions

Microsoft

a person sitting in front of a window 54.3%
a man and woman sitting next to a window 31.4%
a man and a woman sitting in front of a window 31.3%

Text analysis

Amazon

KODVK
1948
SG
and SG
and