Human Generated Data

Title

Untitled (man facing groom and kneeling bride while taking ring from pillow at Greek wedding)

Date

1948

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9206

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Clothing 100
Apparel 100
Human 99.7
Person 99.7
Person 99.5
Person 99.4
Person 99.1
Person 97.8
Person 97.7
Gown 95.7
Fashion 95.7
Person 95.3
Robe 95.1
Female 93.7
Wedding 92.8
Dress 92.7
Woman 83.6
Wedding Gown 82.9
Bride 82.9
Face 81.5
Person 78.9
People 77.6
Overcoat 77.5
Suit 77.5
Coat 77.5
Bridegroom 77.1
Evening Dress 70.9
Portrait 68.3
Photography 68.3
Photo 68.3
Girl 63.5
Plant 62.9
Blossom 62.7
Flower 62.7
Art 60
Performer 58.2
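The Amazon labels above are confidence scores on a 0–100 scale. A minimal sketch of thresholding such tags, using a few pairs copied from the list (a hypothetical post-processing helper, not the museum's actual pipeline):

```python
# A few (label, confidence) pairs taken from the Amazon list above.
labels = [
    ("Clothing", 100.0),
    ("Wedding", 92.8),
    ("Bride", 82.9),
    ("Performer", 58.2),
]

def confident_labels(pairs, threshold=80.0):
    """Keep labels whose confidence meets the threshold, highest first."""
    kept = [(name, conf) for name, conf in pairs if conf >= threshold]
    return sorted(kept, key=lambda p: p[1], reverse=True)

print(confident_labels(labels))
# [('Clothing', 100.0), ('Wedding', 92.8), ('Bride', 82.9)]
```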

Imagga
created on 2022-01-23

water 19.3
man 17.5
people 16.2
bride 15.6
person 14.3
groom 13.7
adult 13.6
harp 13.2
male 12.8
dress 12.6
negative 12.1
wedding 11.9
device 11.7
city 11.6
musical instrument 11.6
travel 11.3
landscape 11.1
brass 10.9
sky 10.8
black 10.2
alone 10
wind instrument 10
river 9.8
old 9.7
business 9.7
outdoors 9.7
urban 9.6
reflection 9.5
architecture 9.5
cornet 9.4
sea 9.4
calm 9.1
newspaper 9.1
building 9
summer 9
film 8.9
women 8.7
world 8.7
product 8.4
boat 8.3
tourism 8.2
human 8.2
transportation 8.1
gown 8
businessman 7.9
marriage 7.6
vacation 7.4
light 7.3
art 7.3
coast 7.2
love 7.1
day 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 98.7
person 88.1
clothing 84.5
black and white 78.4
wedding dress 64.1
drawing 63.3
white 61
old 41.1

Face analysis

Amazon

Google

AWS Rekognition

Age 13-21
Gender Female, 88%
Sad 91.7%
Calm 2.7%
Fear 1.9%
Happy 1.1%
Disgusted 0.9%
Confused 0.8%
Angry 0.6%
Surprised 0.4%

AWS Rekognition

Age 27-37
Gender Male, 92.2%
Sad 58%
Happy 12.8%
Calm 8.4%
Angry 7.9%
Confused 4.8%
Disgusted 3.5%
Surprised 2.8%
Fear 1.8%

AWS Rekognition

Age 14-22
Gender Female, 97%
Calm 99.4%
Happy 0.3%
Sad 0.2%
Angry 0%
Confused 0%
Fear 0%
Surprised 0%
Disgusted 0%

AWS Rekognition

Age 30-40
Gender Male, 99.9%
Sad 86.4%
Calm 6.1%
Angry 2.4%
Disgusted 1.2%
Happy 1.2%
Fear 1%
Surprised 1%
Confused 0.7%
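Each AWS Rekognition face record above reports a full emotion breakdown in percentages. A hypothetical sketch of picking the dominant emotion from one such breakdown (values copied from the last face record; not the museum's actual code):

```python
# Emotion percentages from the fourth AWS Rekognition face above.
emotions = {
    "Sad": 86.4, "Calm": 6.1, "Angry": 2.4, "Disgusted": 1.2,
    "Happy": 1.2, "Fear": 1.0, "Surprised": 1.0, "Confused": 0.7,
}

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(scores.items(), key=lambda item: item[1])

print(dominant_emotion(emotions))  # ('Sad', 86.4)
```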

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
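Unlike Rekognition, Google Vision reports face attributes as likelihood strings rather than percentages. A hypothetical mapping of those strings onto an ordinal scale so results can be compared (the five-step scale is an assumption based on the values shown above):

```python
# Assumed ordering of Google Vision likelihood strings, low to high.
LIKELIHOOD_ORDER = [
    "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely",
]

def likelihood_rank(value):
    """0 for 'Very unlikely' up to 4 for 'Very likely'."""
    return LIKELIHOOD_ORDER.index(value)

# The second face above is 'Blurred: Very likely'; the third, 'Unlikely'.
print(likelihood_rank("Very likely") > likelihood_rank("Unlikely"))  # True
```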

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a vintage photo of a man 88.7%
a vintage photo of a group of people posing for the camera 77%
a vintage photo of a group of people posing for a picture 76.9%

Text analysis

Amazon

sa

Google

2
歌 2 2 2