Human Generated Data

Title

Untitled (two girls in matching dresses standing outside)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17308

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Clothing 99.6
Apparel 99.6
Person 99.5
Human 99.5
Person 99.4
Dress 95
Female 93.7
Face 91
Footwear 86.6
Girl 77
People 75.9
Smile 74.3
Woman 74
Plant 73.2
Grass 70.9
Outdoors 70.7
Tree 70
Shoe 69
Portrait 68.8
Photography 68.8
Photo 68.8
Boot 62.7
Kid 57.7
Child 57.7
Path 57.5
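
The Amazon tags above are label/confidence pairs of the kind returned by Amazon Rekognition's DetectLabels operation. A minimal boto3 sketch follows; the filename, region, and thresholds are placeholder assumptions, not values taken from this record.

import boto3

# Placeholder image file and region; an S3 object reference works the same way.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=55,  # assumed threshold, roughly the lowest score shown above
    )

# Print label names with confidences, mirroring the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")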

Clarifai
created on 2023-10-29

people 99.9
wear 98.5
adult 97.3
portrait 97
woman 95.3
man 95.1
two 95
leader 92.8
three 92.6
group 92.3
child 92.1
military 88.6
monochrome 87.6
four 87.6
uniform 87.4
coat 86.6
retro 86.5
outfit 86.2
facial expression 84.4
administration 82.3
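
The Clarifai concepts above resemble the output of Clarifai's general image-recognition model. The sketch below assumes Clarifai's v2 REST interface; the endpoint path, model id, API key, and image URL are assumptions to be checked against current Clarifai documentation.

import requests

# Assumed Clarifai v2 REST endpoint and general-model id; key and image URL
# are placeholders.
url = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"
headers = {"Authorization": "Key YOUR_CLARIFAI_KEY"}
payload = {"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]}

resp = requests.post(url, headers=headers, json=payload, timeout=30)
resp.raise_for_status()

# Concepts come back as name/value pairs; values are in [0, 1].
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")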

Imagga
created on 2022-02-26

kin 44.9
people 25.7
fashion 24.9
person 22.9
walking 20.8
dress 19
adult 18.8
women 16.6
man 16.5
happy 15.7
portrait 15.5
attractive 15.4
street 14.7
bag 14.3
smile 14.3
city 14.1
cute 13.6
shopping 13.2
clothing 13
old 12.5
happiness 12.5
coat 12.3
couple 12.2
lady 12.2
bags 11.7
black 11.7
child 11.6
lifestyle 11.6
holding 11.6
walk 11.4
male 11.4
cheerful 11.4
urban 11.4
sexy 11.2
style 11.1
casual 11
mother 10.9
smiling 10.8
pretty 10.5
shop 10.3
sale 10.2
model 10.1
one 9.7
gift 9.5
clothes 9.4
vintage 9.1
fun 9
mall 8.8
standing 8.7
holiday 8.6
face 8.5
youth 8.5
girls 8.2
costume 8.2
present 8.2
group 8.1
hair 7.9
business 7.9
buying 7.7
winter 7.7
outdoor 7.6
snow 7.6
customer 7.6
store 7.6
elegance 7.6
human 7.5
park 7.4
pose 7.2
posing 7.1
work 7.1
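
The Imagga tags above match the shape of results from Imagga's /v2/tags endpoint, which returns tag names with confidences. A sketch assuming that REST endpoint and HTTP Basic auth follows; the key, secret, and image URL are placeholders.

import requests

# Assumed Imagga v2 /tags endpoint; credentials and image URL are placeholders.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),
    timeout=30,
)
resp.raise_for_status()

# Each entry carries a confidence and a language-keyed tag name.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")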

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

outdoor 97.5
clothing 96.8
person 92.1
text 91.4
black 88
footwear 85.9
dress 83.1
standing 76.1
smile 75.7
posing 73.9
skirt 50.3
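
The Microsoft tags above are the kind of result produced by the Azure Computer Vision Analyze Image operation with the Tags visual feature. A sketch assuming the v3.2 REST endpoint follows; the resource name, key, and image URL are placeholders.

import requests

# Assumed Azure Computer Vision v3.2 Analyze endpoint; resource, key, and
# image URL are placeholders.
endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com/vision/v3.2/analyze"
resp = requests.post(
    endpoint,
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.org/photo.jpg"},
    timeout=30,
)
resp.raise_for_status()

# Tags are returned with confidences in [0, 1].
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")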

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-36
Gender Male, 73.2%
Happy 53.4%
Calm 31.5%
Surprised 6.2%
Confused 2.3%
Fear 2.2%
Sad 1.7%
Disgusted 1.4%
Angry 1.3%

AWS Rekognition

Age 26-36
Gender Male, 96.6%
Calm 61.7%
Sad 13.4%
Happy 9.9%
Fear 6.2%
Confused 5.6%
Angry 1.6%
Surprised 1%
Disgusted 0.6%
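
The two AWS Rekognition face results above (age range, gender, emotions) correspond to Rekognition's DetectFaces operation with all attributes requested. A minimal boto3 sketch follows; the filename and region are placeholders.

import boto3

# Placeholder image file and region.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions arrive unsorted; order by confidence to mirror the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")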

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
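
The Google Vision results above report per-face likelihood ratings rather than percentages. A sketch using the google-cloud-vision client follows; the image URI is a placeholder.

from google.cloud import vision

# Placeholder image URI; local image bytes can be passed via vision.Image(content=...).
client = vision.ImageAnnotatorClient()
image = vision.Image(source=vision.ImageSource(image_uri="gs://your-bucket/photo.jpg"))

response = client.face_detection(image=image)

# Index into the Likelihood enum names, e.g. VERY_UNLIKELY, POSSIBLE, LIKELY.
likelihood = ("UNKNOWN", "VERY_UNLIKELY", "UNLIKELY", "POSSIBLE", "LIKELY", "VERY_LIKELY")

for face in response.face_annotations:
    print("Joy", likelihood[face.joy_likelihood])
    print("Sorrow", likelihood[face.sorrow_likelihood])
    print("Anger", likelihood[face.anger_likelihood])
    print("Surprise", likelihood[face.surprise_likelihood])
    print("Headwear", likelihood[face.headwear_likelihood])
    print("Blurred", likelihood[face.blurred_likelihood])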

Feature analysis

Amazon

Person
Person 99.5%
Person 99.4%

Categories

Imagga

paintings art 99.7%

Text analysis

Amazon

24
ЧАСОЯ
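
The strings above come from an OCR-style pass; on the Amazon side that corresponds to Rekognition's DetectText operation. A minimal boto3 sketch follows; the filename and region are placeholders.

import boto3

# Placeholder image file and region.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# LINE detections give whole strings; WORD detections give the pieces.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])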

Google

MJIA ODVK 2EE 24
MJIA
ODVK
2EE
24
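
The Google results above follow the shape of Cloud Vision text detection, where the first annotation is the full detected string and the remaining annotations are individual tokens. A sketch with the google-cloud-vision client follows; the image URI is a placeholder.

from google.cloud import vision

# Placeholder image URI; local image bytes can be passed via vision.Image(content=...).
client = vision.ImageAnnotatorClient()
image = vision.Image(source=vision.ImageSource(image_uri="gs://your-bucket/photo.jpg"))

response = client.text_detection(image=image)

# The first annotation is the full detected block; the rest are individual tokens.
for annotation in response.text_annotations:
    print(annotation.description)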