Human Generated Data

Title

Untitled (four women in dresses)

Date

1964

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19194

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Person 99.2
Human 99.2
Person 98.6
Person 98.5
Person 97.4
Clothing 95.4
Apparel 95.4
Shop 85.6
Flooring 78.2
Floor 76.1
Boutique 70
Door 68.5
Evening Dress 68.4
Fashion 68.4
Gown 68.4
Robe 68.4
Overcoat 65.5
Coat 65.5
Text 65.5
Shorts 60.5
Symbol 59.9
Sleeve 59.3
Suit 58.5
Long Sleeve 56.7

Clarifai
created on 2023-10-22

people 99.9
man 98.6
woman 98.4
wedding 97.6
adult 97.4
wear 97
two 95.2
portrait 94.6
family 94.2
group 93.3
dress 91.4
bride 91.4
facial expression 90.4
groom 89.8
four 89.4
three 88.7
actress 87.4
indoors 84.6
affection 83.8
offspring 83.3

Imagga
created on 2022-02-25

shop 36.2
man 31.6
barbershop 30.6
groom 30.2
male 24.9
people 24
adult 24
mercantile establishment 21.4
business 21.3
building 21
women 19
couple 18.3
happy 18.2
person 18.1
office 17.9
corporate 17.2
men 15.5
dress 14.5
place of business 14.3
smile 14.3
businessman 14.1
two 13.6
happiness 13.3
smiling 13
pretty 12.6
work 12.6
room 12.3
indoors 12.3
together 12.3
attractive 11.9
businesswoman 11.8
portrait 11.6
family 11.6
lifestyle 11.6
bride 11.5
professional 11.4
executive 11.3
fashion 11.3
standing 11.3
love 11
wedding 11
cheerful 10.6
modern 10.5
group 10.5
outdoors 10.4
sitting 10.3
looking 9.6
store 9.4
meeting 9.4
city 9.1
call 9.1
holding 9.1
suit 9
job 8.8
home 8.8
life 8.6
boutique 8.5
worker 8.4
communication 8.4
house 8.4
indoor 8.2
team 8.1
nurse 8
door 8
interior 8
black 7.9
day 7.8
wall 7.8
boss 7.7
formal 7.6
casual 7.6
window 7.6
clothes 7.5
style 7.4
sale 7.4
shopping 7.3
center 7.3
lady 7.3
success 7.2
establishment 7.2
working 7.1

Google
created on 2022-02-25

Microsoft
created on 2022-02-25

text 99.4
dress 94
clothing 93.4
person 90.9
woman 87.4
posing 87.3
picture frame 7.3

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 25-35
Gender Female, 99.9%
Happy 99.5%
Surprised 0.2%
Fear 0.1%
Angry 0.1%
Disgusted 0.1%
Sad 0%
Confused 0%
Calm 0%

AWS Rekognition

Age 19-27
Gender Female, 51.4%
Sad 66.3%
Disgusted 8.6%
Angry 7.2%
Happy 5.2%
Confused 4.7%
Fear 2.8%
Surprised 2.6%
Calm 2.6%

AWS Rekognition

Age 21-29
Gender Female, 99.8%
Happy 98.6%
Surprised 0.3%
Angry 0.3%
Fear 0.3%
Disgusted 0.2%
Sad 0.1%
Confused 0.1%
Calm 0.1%

AWS Rekognition

Age 22-30
Gender Female, 99.8%
Happy 75%
Fear 13.1%
Surprised 6.4%
Confused 1.5%
Sad 1.3%
Disgusted 1.2%
Angry 1.1%
Calm 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.2%
Person 98.6%
Person 98.5%
Person 97.4%

Categories

Text analysis

Amazon

DEC
64
131
so

Google

131 • 030
131
030