Human Generated Data

Title

Untitled (children at table drinking juice)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17812

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Furniture 99.8
Human 99.3
Person 99.3
Chair 98.7
Person 97.8
Chair 95.9
Restaurant 93.9
Blonde 93.8
Girl 93.8
Woman 93.8
Female 93.8
Teen 93.8
Kid 93.8
Child 93.8
Dining Table 83.4
Table 83.4
Sitting 72.7
Food 72.7
Cafeteria 71.6
Cafe 68.2
Finger 61.7
Eating 58.8
Meal 58.2
Food Court 55

Imagga
created on 2022-02-26

adult 27.5
person 26.5
man 26.2
musical instrument 25.6
people 24
wind instrument 20.7
male 19.8
brass 19.1
portrait 18.1
device 17.6
attractive 14.7
microphone 14.6
model 14
sexy 13.6
fashion 13.6
human 13.5
looking 12.8
face 12.8
body 12
happy 11.3
hair 11.1
casual 11
lifestyle 10.8
black 10.8
cornet 10.5
health 10.4
work 10.4
men 10.3
stringed instrument 10.3
smiling 10.1
music 10
patient 9.8
medical 9.7
sitting 9.4
senior 9.4
old 9
business 8.5
pretty 8.4
professional 8.4
instrument 8.3
harmonica 8.3
holding 8.2
blond 8.2
sport 8.2
care 8.2
lady 8.1
worker 8.1
handsome 8
working 7.9
boy 7.8
expression 7.7
studio 7.6
head 7.5
fun 7.5
training 7.4
banjo 7.4
chair 7.3
cheerful 7.3
child 7.3
make 7.3
case 7.2
smile 7.1
family 7.1
equipment 7.1
businessman 7.1

Google
created on 2022-02-26

Table 95.5
Standing 86.4
Black-and-white 84.6
Chair 84.4
Style 84
Tableware 77.3
Monochrome 75.8
Monochrome photography 73.9
Window 72.6
Art 70.4
Sitting 68.9
Coffee table 68.8
Room 66.9
Glass 66
Child 65.8
Visual arts 65.8
Toddler 63.9
Desk 63.7
Eyewear 63.7
Still life photography 62.5

Microsoft
created on 2022-02-26

furniture 97.8
table 97.7
text 96.8
chair 88.2
person 68.7
clothing 61.7

Face analysis

Amazon

AWS Rekognition

Age 19-27
Gender Male, 98.3%
Calm 84%
Surprised 13.8%
Confused 0.5%
Angry 0.4%
Sad 0.4%
Disgusted 0.3%
Fear 0.3%
Happy 0.3%

AWS Rekognition

Age 23-33
Gender Male, 86%
Calm 91.3%
Sad 3.3%
Surprised 1.9%
Happy 1.2%
Confused 0.9%
Disgusted 0.6%
Angry 0.6%
Fear 0.2%
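The face-analysis entries above correspond to the per-face structure returned by AWS Rekognition DetectFaces: an age range, a gender estimate with confidence, and a ranked list of emotions. The sketch below is a hedged illustration; the face dict is hardcoded to mirror the first entry above, not an actual API response.

```python
# Illustrative sample in the DetectFaces per-face format.
# Values mirror the first face entry above; hardcoded for demonstration.
sample_face = {
    "AgeRange": {"Low": 19, "High": 27},
    "Gender": {"Value": "Male", "Confidence": 98.3},
    "Emotions": [
        {"Type": "CALM", "Confidence": 84.0},
        {"Type": "SURPRISED", "Confidence": 13.8},
        {"Type": "CONFUSED", "Confidence": 0.5},
    ],
}

def summarize_face(face):
    """Condense one face record into the age/gender/dominant-emotion
    summary shown in the listing above."""
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    return {
        "age": f'{face["AgeRange"]["Low"]}-{face["AgeRange"]["High"]}',
        "gender": face["Gender"]["Value"],
        "dominant_emotion": top["Type"].title(),
    }

print(summarize_face(sample_face))
```

Note that the emotion scores are per-emotion confidences, not a probability distribution, so they need not sum to 100%.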

Feature analysis

Amazon

Person 99.3%
Chair 98.7%

Captions

Microsoft

a group of people in front of a mirror posing for the camera 86.4%
a group of people sitting in front of a mirror posing for the camera 80.3%
a group of people standing in front of a mirror posing for the camera 80.2%

Text analysis

Amazon

13
asi
УТЗАS-A

Google

13
13