Human Generated Data

Title

Untitled (Catholic wedding ceremony, West Allis, Wisconsin)

Date

1955, printed later

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.968

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.1
Human 99.1
Person 97.8
Worker 91.2
Apparel 87.7
Clothing 87.7
Hairdresser 75.8
Person 74
Flooring 71.2
Indoors 60
Floor 56.1
Dressing Room 55.2
Room 55.2
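
The Amazon labels above are the kind of per-label confidence scores returned by the AWS Rekognition DetectLabels API. A minimal sketch using boto3 (assuming configured AWS credentials and a hypothetical local copy of the photograph named wedding_ceremony.jpg):

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local copy of the print being analyzed.
with open("wedding_ceremony.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns object and scene labels with confidence scores,
# comparable to the "Person 99.1", "Clothing 87.7" entries listed above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=55,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```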

Imagga
created on 2021-12-14

chair 32.7
interior 28.3
furniture 24.9
room 24
home 19.1
table 18.5
architecture 18.2
seat 17.1
window 17.1
indoors 16.7
house 16
modern 15.4
people 15.1
floor 14.9
luxury 13.7
women 13.4
living 13.3
inside 12.9
men 12.9
man 12.8
decor 12.4
shop 12
design 11.9
decoration 11.1
indoor 10.9
building 10.9
sofa 10.7
family 10.7
urban 10.5
lamp 10.5
scene 10.4
style 10.4
barber chair 10.3
museum 10.2
salon 10.2
travel 9.9
mercantile establishment 9.6
residential 9.6
person 9.6
barbershop 9.5
sitting 9.4
male 9.2
wood 9.2
restaurant 9.1
adult 8.8
lighting 8.7
light 8.7
glass 8.6
comfortable 8.6
estate 8.5
traditional 8.3
life 8
structure 8
facility 7.8
elegant 7.7
wall 7.7
old 7.7
real 7.6
relax 7.6
elegance 7.6
depository 7.5
place of business 7.5
city 7.5
tourism 7.4
carousel 7.3
business 7.3
dress 7.2
transportation 7.2
love 7.1
hospital 7.1
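
The Imagga tags above come from an image-tagging API that likewise reports a confidence value per tag. A rough sketch against the Imagga v2 tagging endpoint (the credentials, image URL, and response field names here are assumptions, not taken from this record):

```python
import requests

# Hypothetical credentials and image URL.
IMAGGA_KEY = "your_api_key"
IMAGGA_SECRET = "your_api_secret"
IMAGE_URL = "https://example.org/wedding_ceremony.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
response.raise_for_status()

# Assumed response layout: result.tags is a list of
# {"confidence": <float>, "tag": {"en": <label>}} entries.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```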

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 97.6
wedding dress 96.9
indoor 91.5
bride 90.8
dress 83.4
clothing 82.7
woman 78.9
person 76.7
black and white 65.5

Face analysis

AWS Rekognition

Age 39-57
Gender Female, 63.4%
Sad 74.8%
Calm 19.4%
Happy 4.8%
Confused 0.3%
Angry 0.3%
Surprised 0.2%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 3-9
Gender Male, 52.1%
Fear 85.6%
Angry 10.3%
Sad 3.1%
Calm 0.7%
Surprised 0.2%
Confused 0.1%
Happy 0.1%
Disgusted 0%

AWS Rekognition

Age 16-28
Gender Female, 67.5%
Calm 61.8%
Sad 35.6%
Angry 1.2%
Confused 1%
Surprised 0.2%
Fear 0.1%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 22-34
Gender Male, 96.3%
Calm 90.6%
Sad 5%
Angry 3.6%
Surprised 0.3%
Disgusted 0.2%
Happy 0.2%
Confused 0.1%
Fear 0%

AWS Rekognition

Age 20-32
Gender Male, 89.1%
Calm 98.4%
Sad 1.3%
Angry 0.1%
Happy 0.1%
Surprised 0%
Confused 0%
Fear 0%
Disgusted 0%

Microsoft Cognitive Services

Age 25
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
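
The age range, gender, and emotion estimates above are the kind of per-face output AWS Rekognition DetectFaces returns when all facial attributes are requested. A minimal sketch using boto3 (again assuming AWS credentials and a hypothetical local image file):

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("wedding_ceremony.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

# Attributes=["ALL"] adds age range, gender, and emotion estimates
# to each detected face, matching the fields listed above.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```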

Feature analysis

Amazon

Person 99.1%

Captions

Microsoft

a person sitting next to a window 53.8%
a person sitting in a room 53.7%
a person sitting on a bench next to a window 45%
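
Captions like these are produced by the Azure Computer Vision describe operation, which returns candidate sentences with confidence scores. A minimal sketch with the Azure SDK (endpoint, key, and image URL are placeholders):

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Hypothetical endpoint, key, and image URL.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
KEY = "your_subscription_key"
IMAGE_URL = "https://example.org/wedding_ceremony.jpg"

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# Request several candidate captions; each carries a 0-1 confidence,
# printed here as a percentage to match the listing above.
description = client.describe_image(IMAGE_URL, max_candidates=3)
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```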

Text analysis

Amazon

65%
97
P109

Google

K 65%
K
65%
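
The detected strings above ("65%", "97", "P109", "K") are text fragments the services read off the print itself. A minimal sketch of the corresponding AWS Rekognition DetectText call (assuming AWS credentials and the same hypothetical local image file):

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("wedding_ceremony.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

# DetectText returns line- and word-level detections with confidence
# scores; fragments such as "65%" or "P109" appear as DetectedText values.
response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(f"{detection['DetectedText']} ({detection['Confidence']:.1f}%)")
```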