Human Generated Data

Title

Untitled (grandfather with two children)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17219

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Furniture 99.9
Person 99.1
Human 99.1
Couch 98.9
Person 96.9
Clothing 95.8
Apparel 95.8
Female 90.6
Dress 86.7
Face 85.2
Home Decor 80.4
Sitting 79
Plant 78.8
Woman 75
Girl 73.6
Indoors 72.7
Blonde 70.9
Kid 70.9
Teen 70.9
Child 70.9
Potted Plant 70.9
Pottery 70.9
Vase 70.9
Jar 70.9
Leisure Activities 67
Chair 66.8
Living Room 65.6
Room 65.6
Portrait 61.9
Photography 61.9
Photo 61.9
Suit 59.5
Coat 59.5
Overcoat 59.5
Tree 56.6
Gown 56.5
Fashion 56.5
Baby 55.7

Clarifai
created on 2023-10-28

people 99.8
man 97.6
woman 96.8
adult 96.7
group 95.9
wedding 95.4
bride 94.1
monochrome 93.7
sit 93.2
two 91.5
veil 91.4
leader 88.8
groom 88.5
child 85.5
snow 85.1
group together 84.6
family 83
chair 82.7
furniture 81.5
four 79.2

Imagga
created on 2022-02-26

sax 80.9
wind instrument 34.4
man 34.2
adult 28.1
people 26.2
sitting 24.9
male 24.2
person 21
senior 20.6
brass 19.2
outdoors 19
happy 18.2
musical instrument 17.3
smiling 16.6
lifestyle 16.6
couple 16.5
women 15.8
smile 13.5
happiness 13.3
together 13.1
business 12.7
day 12.5
portrait 12.3
chair 12.1
home 12
outdoor 11.5
cornet 11.2
wheelchair 10.7
life 10.7
working 10.6
businessman 10.6
retirement 10.6
mature 10.2
care 9.9
old 9.7
computer 9.6
looking 9.6
elderly 9.6
love 9.5
enjoying 9.5
casual 9.3
park 9.3
relaxation 9.2
laptop 9.1
human 9
family 8.9
office 8.8
water 8.7
looking camera 8.6
men 8.6
two 8.5
attractive 8.4
holding 8.2
relaxing 8.2
cheerful 8.1
suit 8.1
holiday 7.9
boy 7.8
retired 7.7
luxury 7.7
summer 7.7
outside 7.7
youth 7.7
husband 7.6
beach 7.6
wife 7.6
sit 7.6
one person 7.5
house 7.5
fun 7.5
room 7.4
boat 7.4
vacation 7.4
professional 7.3
music 7.3
active 7.3
group 7.2
handsome 7.1
work 7.1
travel 7
sea 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 94.1
person 76.7
clothing 66.4
black and white 51.5
window 16.7
picture frame 9.6

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 47-53
Gender Female, 98.5%
Surprised 46.2%
Sad 25.1%
Fear 11.6%
Calm 10.8%
Happy 3%
Angry 1.9%
Disgusted 0.9%
Confused 0.5%

AWS Rekognition

Age 27-37
Gender Male, 99.9%
Surprised 82.5%
Angry 12.1%
Disgusted 1.3%
Calm 1.2%
Confused 1%
Sad 0.9%
Fear 0.5%
Happy 0.4%

AWS Rekognition

Age 54-64
Gender Male, 99.9%
Calm 72.2%
Surprised 13.2%
Sad 8.7%
Fear 2.4%
Confused 1.5%
Disgusted 0.9%
Angry 0.6%
Happy 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Chair
Person 99.1%
Person 96.9%
Chair 66.8%

Categories

Text analysis

Amazon

26
SOA
MODEK
MODEK EVELLA
EVELLA

Google

26
26