Human Generated Data

Title

Untitled (grandfather with two children)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17220

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Furniture 100
Person 99.5
Human 99.5
Person 98.6
Clothing 98.3
Apparel 98.3
Chair 96.2
Couch 92.4
Sitting 88.5
Female 88.2
Face 83.1
Dress 82.4
Tie 80
Accessories 80
Accessory 80
Woman 75.5
Gown 68.4
Fashion 68.4
Art 67.7
Girl 66.6
Table 66.1
Portrait 65.6
Photography 65.6
Photo 65.6
Flower 65.2
Blossom 65.2
Plant 65.2
Robe 63.3
Vase 62.7
Pottery 62.7
Jar 62.7
Dining Table 59.3
Suit 59
Coat 59
Overcoat 59
Floor 58.4
Leisure Activities 58.1
Indoors 57.7
Potted Plant 57.2
Drawing 55.6
Evening Dress 55.6
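
The labels above pair each tag with a confidence score from 0 to 100, which is the shape of output returned by the Amazon Rekognition DetectLabels API. The following is a minimal sketch of such a call using boto3; the file name, region, label cap, and confidence threshold are assumptions for illustration, not details recorded with this object.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local copy of the photograph; not part of the museum record.
with open("untitled_grandfather_with_two_children.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with 0-100 confidence scores,
# comparable to "Furniture 100", "Person 99.5" in the listing above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,       # assumed cap
    MinConfidence=55,   # assumed threshold; the listing above stops near 55
)

for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))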

Clarifai
created on 2023-10-29

people 99.8
child 98.3
woman 96.6
man 96.4
group 96.3
family 96.3
monochrome 95.8
indoors 92.2
adult 92.1
sit 92
wedding 88.4
two 85.1
leader 83.4
bride 82.7
interaction 81.9
four 80.9
chair 79.5
education 78.7
veil 78.1
enjoyment 77.4

Imagga
created on 2022-02-26

man 32.9
male 25.6
adult 25
person 24
people 22.9
sitting 19.7
business 19.4
groom 18.9
musical instrument 18.6
businessman 18.5
women 18.2
lifestyle 18.1
wind instrument 17.2
couple 16.5
happy 16.3
teacher 15.2
corporate 14.6
men 14.6
office 14.5
smiling 13.7
professional 13.5
happiness 13.3
smile 12.8
meeting 12.2
life 12
black 12
portrait 11.6
indoors 11.4
executive 10.9
chair 10.9
bowed stringed instrument 10.7
stringed instrument 10.4
room 10.3
love 10.3
two 10.2
communication 10.1
sax 10.1
suit 9.9
team 9.8
group 9.7
together 9.6
home 9.6
work 9.4
enjoyment 9.4
casual 9.3
future 9.3
educator 9.3
brass 9.3
modern 9.1
attractive 9.1
businesswoman 9.1
clothing 8.8
bride 8.8
boss 8.6
relaxation 8.4
human 8.2
worker 8.1
sexy 8
job 8
hair 7.9
color 7.8
40s 7.8
woodwind 7.6
talking 7.6
businesspeople 7.6
career 7.6
togetherness 7.6
manager 7.4
phone 7.4
building 7.4
back 7.3
alone 7.3
indoor 7.3
success 7.2
looking 7.2
body 7.2
family 7.1
face 7.1
cornet 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 99.2
wedding 87.3
window 80.7
wedding dress 80.2
clothing 78.9
person 77.1
ceremony 61.5
bride 58.5
old 55
picture frame 15.1

Face analysis

AWS Rekognition

Age 56-64
Gender Male, 99.5%
Sad 72.5%
Calm 14.7%
Confused 6.2%
Happy 3.1%
Disgusted 1.1%
Angry 1%
Surprised 0.9%
Fear 0.5%

AWS Rekognition

Age 26-36
Gender Male, 99.6%
Surprised 86%
Angry 4.5%
Calm 3.5%
Fear 2.1%
Disgusted 1.6%
Happy 1%
Sad 1%
Confused 0.3%
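
The two face records above (estimated age range, gender, and per-emotion confidences) match the shape of output from the Amazon Rekognition DetectFaces API when all facial attributes are requested. A minimal sketch with boto3 follows; the file name and region are assumptions, and the image path is hypothetical.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local copy of the photograph; not part of the museum record.
with open("untitled_grandfather_with_two_children.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotion scores per face.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are returned as a list of type/confidence pairs, as listed above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")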

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Person 98.6%
Chair 96.2%
Tie 80%

Categories

Text analysis

Amazon

SOA
VTDD