Human Generated Data

Title

Untitled (girls at table having tea party)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17677

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99
Human 99
Person 98.8
Meal 95.4
Food 95.4
Person 94.6
Blonde 90.6
Female 90.6
Girl 90.6
Kid 90.6
Teen 90.6
Woman 90.6
Child 90.6
Dish 86.7
Tabletop 85
Furniture 85
Clothing 84.5
Apparel 84.5
Table 81.7
Sitting 81.4
Plant 81.3
Restaurant 79.6
Tree 78
Sweets 78
Confectionery 78
Face 77.8
Dining Table 75.8
Vegetation 66.4
Cafeteria 65.7
Finger 62.2
Icing 58.9
Dessert 58.9
Cake 58.9
Cream 58.9
Creme 58.9
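
The Amazon label/confidence pairs above follow the shape of an AWS Rekognition DetectLabels response. As a minimal sketch (the sample response values here are illustrative, not the actual analysis of this photograph; a live call would use `boto3.client("rekognition").detect_labels(...)`), the listing can be reproduced from such a response like this:

```python
def extract_labels(response, min_confidence=55.0):
    """Flatten a DetectLabels-style response into (name, confidence) pairs,
    keeping only labels at or above the confidence threshold."""
    return [
        (label["Name"], round(label["Confidence"], 1))
        for label in response.get("Labels", [])
        if label["Confidence"] >= min_confidence
    ]

# Trimmed sample in the DetectLabels response format (values illustrative).
sample = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.04},
        {"Name": "Meal", "Confidence": 95.42},
        {"Name": "Icing", "Confidence": 58.93},
    ]
}

print(extract_labels(sample))
```

Raising `min_confidence` trims low-certainty guesses such as "Icing" or "Cake", which is how listings like the one above are typically thresholded.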

Clarifai
created on 2023-10-29

people 99.9
monochrome 98.5
woman 98.3
child 98
adult 97.3
group 96.5
man 94.8
two 91.4
group together 90.5
mirror 90.3
recreation 88.5
administration 87.4
vehicle 86.6
three 86.6
wear 85.4
indoors 84.4
music 84.3
boy 83.4
actress 82.9
four 82.3

Imagga
created on 2022-02-26

man 29.5
tub 29.1
vessel 26.3
happy 25.6
people 25.6
male 25.6
adult 23.9
child 22.5
kin 19.7
happiness 18.8
fun 18.7
couple 18.3
person 18.2
smiling 16.6
senior 15.9
lifestyle 15.9
sitting 15.4
outdoors 15
family 14.2
together 13.1
playing 12.7
leisure 12.4
kid 12.4
summer 12.2
smile 12.1
children 11.8
portrait 11.6
home 11.1
mature 11.1
love 11
two 11
childhood 10.7
face 10.6
vehicle 10.5
group 10.5
men 10.3
women 10.3
relaxation 10
joy 10
outdoor 9.9
holding 9.9
vacation 9.8
park 9.7
husband 9.5
play 9.5
wife 9.5
friends 9.4
casual 9.3
hospital 9
active 9
technology 8.9
grandfather 8.9
interior 8.8
computer 8.8
little 8.8
indoors 8.8
chair 8.7
outside 8.5
wheel 8.5
cheerful 8.1
handsome 8
holiday 7.9
day 7.8
their 7.8
education 7.8
youth 7.7
car 7.6
beach 7.6
laughing 7.5
communication 7.5
musical instrument 7.5
enjoyment 7.5
human 7.5
water 7.3
relaxing 7.3
color 7.2
blond 7.2
work 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

person 99.1
woman 92.1
black and white 91.4
human face 84.3
window 82.7
dish 75.1
mirror 72.3
text 60.3
table 53.1

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 16-22
Gender Female, 91.2%
Calm 61.9%
Sad 20.1%
Happy 7.5%
Surprised 2.3%
Angry 2.3%
Disgusted 2.3%
Confused 1.8%
Fear 1.6%

AWS Rekognition

Age 23-33
Gender Female, 99.3%
Surprised 48.1%
Sad 35.8%
Calm 10.1%
Confused 1.9%
Fear 1.2%
Disgusted 1.1%
Angry 1.1%
Happy 0.6%
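
The age range, gender, and ranked emotion scores above match the fields of an AWS Rekognition DetectFaces `FaceDetails` entry. A minimal sketch of condensing one such entry into the format shown (the sample values are illustrative, not this photograph's actual analysis):

```python
def summarize_face(face_detail):
    """Condense a DetectFaces FaceDetails entry into age range, gender,
    and emotions sorted by descending confidence."""
    age = face_detail["AgeRange"]
    gender = face_detail["Gender"]
    emotions = sorted(face_detail["Emotions"],
                      key=lambda e: e["Confidence"], reverse=True)
    return {
        "age": f'{age["Low"]}-{age["High"]}',
        "gender": f'{gender["Value"]}, {gender["Confidence"]:.1f}%',
        "emotions": [(e["Type"], round(e["Confidence"], 1)) for e in emotions],
    }

# Trimmed sample in the DetectFaces response format (values illustrative).
sample_face = {
    "AgeRange": {"Low": 16, "High": 22},
    "Gender": {"Value": "Female", "Confidence": 91.2},
    "Emotions": [
        {"Type": "CALM", "Confidence": 61.9},
        {"Type": "SAD", "Confidence": 20.1},
        {"Type": "HAPPY", "Confidence": 7.5},
    ],
}

print(summarize_face(sample_face))
```

Emotion confidences in a DetectFaces response sum to roughly 100% per face, which is why each block above reads as a ranked distribution rather than independent scores.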

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99%
Person 98.8%
Person 94.6%

Categories

Text analysis

Amazon

12
-2.VLE1X