Human Generated Data

Title

Untitled (man with boy, playing ball)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17299

Machine Generated Data

Tags (label confidence scores, 0-100)

Amazon
created on 2022-02-26

Person 99.1
Human 99.1
Person 98
Clothing 88.9
Apparel 88.9
Musician 81.5
Musical Instrument 81.5
Electronics 70.2
Egg 65.5
Food 65.5
Portrait 63.3
Photography 63.3
Face 63.3
Photo 63.3
Sleeve 63.2
Sitting 62.7
Long Sleeve 61.8
Leisure Activities 60.3
Outdoors 59.5
Coat 58.4
Shirt 56.1
Brass Section 55.6
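
These Amazon labels are the kind of output returned by Amazon Rekognition's DetectLabels API. A minimal sketch with boto3, assuming the image is stored in S3; the bucket, key, MaxLabels, and MinConfidence values below are illustrative placeholders, not values from this record:

```python
# Minimal sketch: image labeling with Amazon Rekognition via boto3.
# The S3 bucket/key, MaxLabels, and MinConfidence values are illustrative
# assumptions, not values taken from this record.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "example.jpg"}},
    MaxLabels=25,
    MinConfidence=55.0,
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```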

Clarifai
created on 2023-10-29

people 99.8
leader 98.4
man 98.2
monochrome 97.6
adult 97.6
two 96.8
speaker 96.7
megaphone 96.6
veil 95.1
administration 93.9
group 89.5
musician 89.2
group together 88.8
one 87.8
woman 87.7
lid 86.2
music 84.9
actor 84.4
three 83.9
wear 82.3
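
The Clarifai concepts above could come from a general-recognition model queried over Clarifai's v2 REST predict endpoint. A rough sketch with the requests library; the endpoint path, auth header, model ID, and response layout are assumptions and should be checked against Clarifai's current documentation:

```python
# Rough sketch of a Clarifai v2 "predict" call using the requests library.
# The API key placeholder, model ID, endpoint path, and response layout are
# assumptions; verify against Clarifai's current API documentation.
import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"       # placeholder
MODEL_ID = "general-image-recognition"  # assumed general-purpose model

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
)
resp.raise_for_status()

# Concepts carry a 0-1 value; scale to match the 0-100 figures shown above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```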

Imagga
created on 2022-02-26

musical instrument 26.9
person 26.3
people 26.2
adult 24.6
hat 22.9
man 21.5
male 20.5
percussion instrument 18.5
portrait 18.1
women 16.6
attractive 16.1
drum 16
device 14.8
happy 14.4
clothing 14.1
pretty 14
megaphone 12.9
fashion 12.8
black 12.6
smile 12.1
men 12
casual 11.8
smiling 11.6
youth 11.1
model 10.9
lifestyle 10.8
hand 10.6
acoustic device 10.5
fun 10.5
outdoors 10.4
sexy 10.4
style 10.4
sitting 10.3
umbrella 10.2
face 9.9
outdoor 9.9
mother 9.8
banjo 9.8
human 9.7
lady 9.7
working 9.7
one 9.7
look 9.6
couple 9.6
happiness 9.4
headdress 9.3
joy 9.2
child 9.1
old 9
dress 9
stringed instrument 8.9
posing 8.9
together 8.7
hair 8.7
work 8.7
love 8.7
culture 8.5
expression 8.5
modern 8.4
street 8.3
occupation 8.2
20s 8.2
worker 8.1
looking 8
cute 7.9
brunette 7.8
outside 7.7
traditional 7.5
vintage 7.4
holding 7.4
park 7.4
lips 7.4
business 7.3
dirty 7.2
body 7.2
cowboy hat 7.1
summer 7.1
day 7.1
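
Imagga tags like these come from its /v2/tags endpoint, which reports confidence on a 0-100 scale. A short sketch using requests with HTTP basic auth; the key, secret, and image URL are placeholders:

```python
# Sketch of an Imagga /v2/tags request with HTTP basic auth; the key, secret,
# and image URL are placeholders.
import requests

IMAGGA_KEY = "YOUR_API_KEY"
IMAGGA_SECRET = "YOUR_API_SECRET"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

# Each entry pairs an English tag with a 0-100 confidence score.
for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```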

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

outdoor 99.8
person 97
black and white 92.3
man 90.6
clothing 88.7
text 71
megaphone 27.9
crowd 1
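
The Microsoft tags resemble the output of Azure Computer Vision's image-tagging operation. A sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, subscription key, and image URL are placeholders, and the SDK's 0-1 confidences are scaled to percent for display:

```python
# Sketch using the azure-cognitiveservices-vision-computervision SDK.
# The endpoint, subscription key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_SUBSCRIPTION_KEY"),
)

result = client.tag_image("https://example.org/photo.jpg")

# The SDK returns confidence on a 0-1 scale; scale to percent for display.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```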

Face analysis

AWS Rekognition

Age 19-27
Gender Male, 99.7%
Calm 97.7%
Sad 1.2%
Confused 0.3%
Angry 0.3%
Surprised 0.3%
Disgusted 0.1%
Happy 0.1%
Fear 0.1%

AWS Rekognition

Age 33-41
Gender Male, 89.7%
Calm 100%
Happy 0%
Sad 0%
Disgusted 0%
Confused 0%
Surprised 0%
Angry 0%
Fear 0%
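
Both face records above follow the shape of Amazon Rekognition's DetectFaces response: an age range, a gender estimate with confidence, and a ranked list of emotions. A minimal boto3 sketch; the S3 location is an illustrative assumption:

```python
# Minimal sketch: face attributes with Amazon Rekognition DetectFaces.
# The S3 bucket/key are illustrative assumptions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "example.jpg"}},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions arrive unsorted; rank them by confidence as in the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```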

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
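
The Google Vision results report likelihood buckets (Very unlikely through Very likely) rather than percentages, one block per detected face. A sketch with the google-cloud-vision client library (v2+ assumed; the local file path is a placeholder):

```python
# Sketch of Google Cloud Vision face detection (google-cloud-vision v2+ assumed).
# The local file path is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face reports likelihood buckets rather than numeric confidences.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```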

Feature analysis

Amazon

Person 99.1%
Person 98%
Egg 65.5%
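
The feature-analysis entries correspond to labels for which Rekognition also returned per-instance bounding boxes. A sketch of reading those instances from a DetectLabels response; the S3 location is an illustrative assumption:

```python
# Sketch: pull per-instance bounding boxes from a DetectLabels response.
# The S3 bucket/key are illustrative assumptions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")
response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "example.jpg"}}
)

for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]  # ratios of image width/height
        print(f'{label["Name"]} {instance["Confidence"]:.1f}% '
              f'(left={box["Left"]:.2f}, top={box["Top"]:.2f}, '
              f'width={box["Width"]:.2f}, height={box["Height"]:.2f})')
```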

Captions