Human Generated Data

Title

Untitled ("queen and ladies" outside on steps)

Date

1948

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19337

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clothing 99.9
Apparel 99.9
Person 97.9
Human 97.9
Person 97.8
Person 95.2
Person 94.2
Person 93.1
Person 92.6
Fashion 89.3
Person 87.5
Person 86.1
Robe 78.1
Cloak 73.8
Evening Dress 71
Gown 71
Staircase 66.4
Female 65.6
Overcoat 59.8
Coat 59.8
Wedding 55.6
Woman 55.5
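
The labels above match the output shape of Amazon Rekognition's DetectLabels operation. A minimal sketch of how comparable tags could be regenerated with boto3 follows; the local filename photo.jpg and the confidence cutoff are assumptions, not part of the original record.

```python
# Minimal sketch: label tagging with Amazon Rekognition (boto3).
# Assumes configured AWS credentials and a local copy of the image
# saved as the hypothetical file "photo.jpg".
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,
        MinConfidence=55,  # assumed cutoff; the lowest score listed above is ~55
    )

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```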

Clarifai
created on 2023-10-22

people 99.9
adult 98.2
woman 98.1
group 97.5
actress 97.2
wear 96.5
art 96.4
dress 96
portrait 94.9
wedding 94.2
princess 94.2
music 94.1
child 93.7
man 93.7
step 93.2
outfit 91.7
veil 91.3
gown (clothing) 91
costume 90.2
castle 89.9
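
Clarifai concept predictions of this kind can be requested over its v2 REST API. The sketch below is an assumption-laden outline: the personal access token, the general-image-recognition model path, and the image URL are all placeholders and may need adjusting (for example, community models can require user/app qualifiers in the URL).

```python
# Minimal sketch: concept tags from the Clarifai v2 REST API.
# The token env var, model path, and image URL are hypothetical placeholders.
import os
import requests

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {os.environ['CLARIFAI_PAT']}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
resp.raise_for_status()

# Each concept carries a 0-1 value; scale to match the percentages above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```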

Imagga
created on 2022-03-05

groom 26
people 24.5
window 19.2
clothing 17.9
man 16.8
city 15.8
business 15.8
black 15.6
dress 15.3
kin 14.9
adult 14.5
architecture 14.2
urban 14
shop 13.5
women 13.4
gown 13.4
old 13.2
building 13.1
travel 12.7
love 12.6
person 12.5
covering 12.1
fashion 12.1
barbershop 11.4
boutique 11.3
silhouette 10.8
male 10.6
men 10.3
indoor 10
bride 9.9
academic gown 9.9
religion 9.9
robe 9.8
mortarboard 9.6
couple 9.6
historical 9.4
wall 9.4
office 9.3
tourism 9.1
history 8.9
garment 8.8
businessman 8.8
sitting 8.6
culture 8.5
adults 8.5
street 8.3
wedding 8.3
style 8.2
interior 8
mercantile establishment 7.7
corporate 7.7
statue 7.7
married 7.7
stone 7.6
leisure 7.5
church 7.4
cap 7.4
inside 7.4
life 7.3
group 7.3
suit 7.2
home 7.2
portrait 7.1
cloak 7.1
gate 7
modern 7
together 7
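
The Imagga tags correspond to its /v2/tags endpoint. A minimal sketch, assuming an API key/secret pair and a publicly reachable image URL (both placeholders):

```python
# Minimal sketch: tagging via the Imagga v2 REST API.
# The key/secret pair and image URL are hypothetical placeholders.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),  # HTTP basic auth
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```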

Google
created on 2022-03-05

Window 89.9
Black 89.6
Stairs 87.9
Standing 86.4
Black-and-white 85.5
Style 83.9
Tints and shades 76.5
Monochrome 75.7
Monochrome photography 75.4
Beauty 75.1
Art 70.6
Room 66.4
Event 66
Stock photography 65.3
Door 64.8
Rectangle 64.1
History 63.3
Vintage clothing 61.2
Font 60.1
Sitting 59.5
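
The Google tags correspond to Cloud Vision label detection. A minimal sketch with the official Python client, assuming Application Default Credentials and a local hypothetical photo.jpg:

```python
# Minimal sketch: label detection with the Google Cloud Vision Python client.
# Assumes Application Default Credentials and a local hypothetical photo.jpg.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)
for label in response.label_annotations:
    # Scores are 0-1; scale to match the percentages listed above.
    print(f"{label.description} {label.score * 100:.1f}")
```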

Microsoft
created on 2022-03-05

building 99.9
dress 98.7
outdoor 98.5
wedding dress 98.2
clothing 93.6
bride 92.6
text 91.7
black and white 89.5
black 89.2
woman 87.5
person 87.4
white 73.7
statue 60.2
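
The Microsoft tags are consistent with the Azure Computer Vision tagging operation. A minimal sketch with the Azure Python SDK follows; the endpoint, subscription key, and image URL are placeholders.

```python
# Minimal sketch: image tagging with the Azure Computer Vision SDK.
# The endpoint, subscription key, and image URL are hypothetical placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<your-subscription-key>"),
)

result = client.tag_image("https://example.com/photo.jpg")
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```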

Face analysis

AWS Rekognition

Age 22-30
Gender Male, 92.3%
Confused 50.8%
Calm 39.6%
Sad 3.9%
Surprised 3.4%
Happy 1.1%
Angry 0.6%
Disgusted 0.3%
Fear 0.1%

AWS Rekognition

Age 26-36
Gender Female, 64.3%
Calm 88.7%
Happy 5.8%
Surprised 1.8%
Confused 1.3%
Sad 1.1%
Disgusted 0.8%
Angry 0.4%
Fear 0.1%

AWS Rekognition

Age 24-34
Gender Female, 95%
Calm 99%
Surprised 0.8%
Sad 0%
Fear 0%
Confused 0%
Disgusted 0%
Happy 0%
Angry 0%

AWS Rekognition

Age 19-27
Gender Male, 99.5%
Calm 70%
Sad 9.8%
Surprised 8%
Happy 5.3%
Fear 2.3%
Angry 1.8%
Disgusted 1.5%
Confused 1.3%
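
The per-face age ranges, gender estimates, and emotion scores above follow the shape of Amazon Rekognition's DetectFaces response. A minimal sketch, assuming boto3 credentials and a local hypothetical photo.jpg:

```python
# Minimal sketch: face attributes with Amazon Rekognition DetectFaces (boto3).
# Assumes configured AWS credentials and a local hypothetical photo.jpg.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back unsorted; sort to match the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```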

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
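
The Google Vision entries are per-face likelihood ratings rather than percentages. A minimal sketch with the Cloud Vision Python client, under the same assumptions as the label-detection example above:

```python
# Minimal sketch: face likelihood ratings with Google Cloud Vision.
# Assumes Application Default Credentials and a local hypothetical photo.jpg.
from google.cloud import vision

# Likelihood enum values are ordinal; map them to readable names.
LIKELIHOOD = ("Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely")

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", LIKELIHOOD[face.surprise_likelihood])
    print("Anger", LIKELIHOOD[face.anger_likelihood])
    print("Sorrow", LIKELIHOOD[face.sorrow_likelihood])
    print("Joy", LIKELIHOOD[face.joy_likelihood])
    print("Headwear", LIKELIHOOD[face.headwear_likelihood])
    print("Blurred", LIKELIHOOD[face.blurred_likelihood])
```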

Feature analysis

Amazon

Person
Staircase
Person 97.9%
Person 97.8%
Person 95.2%
Person 94.2%
Person 93.1%
Person 92.6%
Person 87.5%
Person 86.1%
Staircase 66.4%

Text analysis

Amazon

YТ3-

Google

YT37A2-XAGOX
YT37A2-XAGOX
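
The detected strings above could be reproduced with each provider's OCR operation, as sketched below; credentials and the local photo.jpg are assumptions. Google's text_detection typically returns the full text first and then individual blocks, which may explain the repeated line.

```python
# Minimal sketch: OCR text detection with Amazon Rekognition and Google Cloud
# Vision. Assumes credentials for both services and a local hypothetical photo.jpg.
import boto3
from google.cloud import vision

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# Amazon Rekognition: returns both LINE and WORD level detections.
rekognition = boto3.client("rekognition")
for detection in rekognition.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
    print("Amazon:", detection["DetectedText"])

# Google Cloud Vision: the first annotation is the full text, followed by blocks.
vision_client = vision.ImageAnnotatorClient()
response = vision_client.text_detection(image=vision.Image(content=image_bytes))
for annotation in response.text_annotations:
    print("Google:", annotation.description)
```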