Human Generated Data

Title

Untitled (women at piano and singing)

Date

1945

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1647

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Clothing 99.8
Apparel 99.8
Person 99.5
Human 99.5
Female 96.9
Person 95.9
Woman 91.1
Dress 89.1
Gown 86.6
Fashion 86.6
Robe 86.2
Evening Dress 74.7
Wedding 74.3
Person 73
Person 71.2
Person 70.1
Face 70
Photography 68.5
Photo 68.5
Portrait 68.2
Wedding Gown 66.8
Plant 60.9
Girl 58.2
Bride 56.3
Sitting 55.8
Suit 55.6
Coat 55.6
Overcoat 55.6
Person 44.1

Clarifai
created on 2023-10-25

people 99.9
group 98.9
woman 98.8
adult 98.4
man 96.4
monochrome 92.5
furniture 91
indoors 89.1
room 88.9
wedding 88.9
veil 88.7
family 85.3
music 84.1
child 83.6
leader 83.3
bride 82.9
many 82.8
two 82.5
wear 81.9
several 81.2

Imagga
created on 2021-12-14

home 30.3
people 27.9
interior 26.5
shop 26.1
room 25.2
man 24.3
person 23.8
adult 23.3
house 22.6
male 19.9
indoors 19.3
modern 18.2
happy 18.2
indoor 17.3
mercantile establishment 17.1
professional 16.4
table 15.1
men 14.6
business 14.6
work 14.2
barbershop 14.2
furniture 13.7
worker 13.4
women 12.6
office 12.5
apartment 12.4
businessman 12.4
couple 12.2
desk 11.9
day 11.8
chair 11.6
working 11.5
place of business 11.4
salon 11.2
corporate 11.2
smiling 10.8
window 10.7
decor 10.6
new 10.5
looking 10.4
clothing 10.3
laptop 10.1
lifestyle 10.1
smile 10
kitchen 10
family 9.8
computer 9.6
luxury 9.4
happiness 9.4
horizontal 9.2
inside 9.2
portrait 9.1
decoration 9
groom 9
teacher 8.9
technology 8.9
celebration 8.8
moving 8.6
executive 8.6
businesspeople 8.5
space 8.5
living 8.5
bakery 8.5
domestic 8.5
casual 8.5
floor 8.4
fashion 8.3
holding 8.2
human 8.2
20s 8.2
businesswoman 8.2
job 8
medical 7.9
standing 7.8
boxes 7.8
sitting 7.7
attractive 7.7
two 7.6
meeting 7.5
light 7.4
negative 7.3
alone 7.3
box 7.3
group 7.3
boutique 7.3
success 7.2
copy space 7.2
black 7.2
team 7.2
mother 7.1
handsome 7.1
idea 7.1
together 7
life 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 99.6
window 94.5
dress 94
wedding dress 93.8
clothing 89.8
indoor 86.7
woman 86.6
person 82.4
bride 74.1
old 67.4
image 30.5

Color Analysis

Face analysis

AWS Rekognition

Age 23-35
Gender Male, 93%
Calm 90.4%
Happy 4.3%
Angry 2.6%
Sad 1%
Disgusted 0.7%
Confused 0.4%
Surprised 0.4%
Fear 0.1%

AWS Rekognition

Age 32-48
Gender Female, 54.6%
Calm 70.4%
Sad 15.4%
Confused 7.7%
Surprised 4.8%
Fear 0.6%
Angry 0.5%
Happy 0.5%
Disgusted 0.2%

AWS Rekognition

Age 40-58
Gender Female, 86.5%
Calm 77.1%
Sad 11.4%
Happy 5.5%
Angry 3.7%
Confused 1.1%
Fear 0.7%
Disgusted 0.3%
Surprised 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Feature analysis

Amazon

Person 99.5%

Categories