Human Generated Data

Title

Untitled (three women standing in front of tree)

Date

c. 1930

People

Artist: Curtis Studio, American, active 1891–1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1276

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Clothing 99.9
Apparel 99.9
Person 98.8
Human 98.8
Overcoat 98
Coat 98
Wheel 97.2
Machine 97.2
Person 96.3
Tie 91.9
Accessories 91.9
Accessory 91.9
Person 91
Plant 90.9
Tree 90.7
Face 88.4
Jacket 80.2
Tuxedo 77.8
Hat 71.2
Female 68.8
People 67.6
Transportation 66.5
Vehicle 66.5
Automobile 66.5
Car 66.5
Portrait 62.7
Photography 62.7
Photo 62.7
Suit 61.4
Blazer 59

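Tag lists like the one above, with names paired to 0–100 confidence scores, are the standard output of Amazon Rekognition's label detection. A minimal sketch of such a call with boto3 (the region, bucket, and object key are hypothetical placeholders, not the museum's actual storage):

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Hypothetical S3 location of the scanned photograph.
    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "curtis-studio-1276.jpg"}},
        MaxLabels=50,
        MinConfidence=50,
    )

    # Each label carries a name and a 0-100 confidence score,
    # matching the "Clothing 99.9"-style pairs listed above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')
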
Clarifai
created on 2023-10-26

people 100
portrait 99.6
group 99.4
adult 98.9
man 98.2
two 97.9
three 97.3
administration 97.2
group together 97.2
wear 97.1
leader 96.7
several 95.8
four 95.7
five 91.5
street 90.9
woman 90
monochrome 89.8
offspring 89.7
veil 89.4
outerwear 89

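The Clarifai concepts above are the kind of output returned by Clarifai's general image-recognition model. A hedged sketch against the documented v2 REST API; the API key, model ID, and image URL are placeholders, and newer Clarifai accounts may require a personal access token plus explicit user/app IDs instead of a key:

    import requests

    # Placeholders: supply your own Clarifai credentials and a public image URL.
    API_KEY = "YOUR_CLARIFAI_API_KEY"
    MODEL_ID = "general-image-recognition"
    IMAGE_URL = "https://example.org/photo.jpg"

    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
        timeout=30,
    )
    resp.raise_for_status()

    # Concepts are scored 0-1; scale to the 0-100 range shown above.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')
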
Imagga
created on 2022-01-22

man 32.9
person 30.5
people 28.4
male 27.1
adult 26
portrait 23.9
attractive 22.4
happy 21.9
fashion 21.9
kin 21.1
standing 20
business 18.8
smile 18.5
couple 17.4
smiling 17.4
businessman 16.8
clothing 16.1
suit 15.8
old 14.6
dress 14.5
handsome 14.3
black 14
professional 13.7
corporate 13.7
pretty 13.3
group 12.9
outside 12.8
world 12.7
lifestyle 12.3
brunette 12.2
mother 12
outdoors 11.9
casual 11.9
jacket 11.5
together 11.4
face 11.4
men 11.2
executive 11.1
women 11.1
love 11
work 11
businesswoman 10.9
family 10.7
job 10.6
building 10.4
model 10.1
20s 10.1
groom 10
confident 10
city 10
worker 9.9
one 9.7
ethnic 9.5
tie 9.5
wall 9.4
happiness 9.4
clothes 9.4
cute 9.3
elegance 9.2
girls 9.1
stylish 9
style 8.9
looking 8.8
urban 8.7
youth 8.5
mature 8.4
successful 8.2
lady 8.1
success 8
sexy 8
office 8
cool 8
life 7.8
30s 7.7
winter 7.7
walk 7.6
park 7.4
friendly 7.3
cheerful 7.3
teenager 7.3
team 7.2
day 7.1

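The Imagga tags above follow the shape of Imagga's v2 tagging endpoint, which authenticates with an API key/secret pair over HTTP Basic auth. A hedged sketch; the credentials and image URL are placeholders:

    import requests

    # Placeholders: Imagga issues a key/secret pair for Basic auth.
    API_KEY = "YOUR_IMAGGA_KEY"
    API_SECRET = "YOUR_IMAGGA_SECRET"
    IMAGE_URL = "https://example.org/photo.jpg"

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
        timeout=30,
    )
    resp.raise_for_status()

    # Each entry pairs an English tag with a 0-100 confidence score.
    for entry in resp.json()["result"]["tags"]:
        print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')
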
Google
created on 2022-01-22

(no tags recorded)

Microsoft
created on 2022-01-22

outdoor 99.7
person 98.8
clothing 97
standing 96.4
posing 92.4
text 92.3
smile 90.2
man 86.5
human face 84.6
black 74.4
car 71.3
black and white 65.3
vehicle 54.2
land vehicle 54.1
clothes 27.6

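The Microsoft tags above match the output of Azure Computer Vision's image analysis. A hedged sketch against the v3.2 REST API; the resource endpoint, subscription key, and image URL are placeholders:

    import requests

    # Placeholders: endpoint and key come from an Azure Computer Vision resource.
    ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
    KEY = "YOUR_AZURE_KEY"
    IMAGE_URL = "https://example.org/photo.jpg"

    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": IMAGE_URL},
        timeout=30,
    )
    resp.raise_for_status()

    # Confidence is reported 0-1; scale to match the 0-100 display above.
    for tag in resp.json()["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
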
Face analysis

AWS Rekognition

Age 45-51
Gender Female, 97.1%
Calm 72.8%
Happy 19.7%
Surprised 1.7%
Confused 1.5%
Disgusted 1.4%
Sad 1.1%
Fear 1.1%
Angry 0.7%

AWS Rekognition

Age 73-83
Gender Female, 100%
Calm 69.4%
Happy 20.9%
Sad 2.8%
Surprised 2.3%
Angry 1.5%
Disgusted 1.1%
Confused 1.1%
Fear 0.9%

AWS Rekognition

Age 45-53
Gender Female, 100%
Calm 97.5%
Happy 0.7%
Confused 0.5%
Surprised 0.4%
Sad 0.4%
Disgusted 0.2%
Angry 0.2%
Fear 0.2%

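The three per-face blocks above (age range, gender, and a ranked emotion distribution) match Rekognition's face detection output when all facial attributes are requested. A minimal sketch with boto3; the S3 location is a hypothetical placeholder:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Attributes=["ALL"] requests age, gender, and emotion estimates
    # like the per-face blocks above.
    response = rekognition.detect_faces(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "curtis-studio-1276.jpg"}},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
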
Microsoft Cognitive Services

Age 67
Gender Female

Microsoft Cognitive Services

Age 39
Gender Female

Microsoft Cognitive Services

Age 55
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

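Unlike the other services, the Google Vision rows report likelihood buckets (Very unlikely through Very likely) rather than numeric scores. A sketch with the google-cloud-vision client library; the image URL is a placeholder and credentials are assumed to be configured in the environment:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    # Placeholder URL; face detection returns one annotation per face.
    image = vision.Image()
    image.source.image_uri = "https://example.org/photo.jpg"

    response = client.face_detection(image=image)

    # Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY),
    # matching the bucketed rows above.
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
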
Feature analysis

Amazon

Person 98.8%
Wheel 97.2%
Tie 91.9%
Hat 71.2%
Car 66.5%
Suit 61.4%

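The feature list above repeats the object-like Amazon labels with their confidence scores. A small illustrative helper showing how such a list can be filtered by a confidence cutoff; the scores are copied from the list above, and the threshold rule and cutoff value are assumptions for illustration:

    # Confidence scores (0-100) copied from the Amazon feature list above.
    features = {
        "Person": 98.8, "Wheel": 97.2, "Tie": 91.9,
        "Hat": 71.2, "Car": 66.5, "Suit": 61.4,
    }

    def above_cutoff(tags: dict[str, float], cutoff: float = 60.0) -> dict[str, float]:
        """Keep only tags whose confidence clears the cutoff."""
        return {name: score for name, score in tags.items() if score >= cutoff}

    print(above_cutoff(features, cutoff=90.0))
    # {'Person': 98.8, 'Wheel': 97.2, 'Tie': 91.9}
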