Human Generated Data

Title

Untitled (two bride and groom couples posed with all their parents on stage decorated with foliage)

Date

1952

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9387

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.2
Human 99.2
Person 99.1
Person 99
Person 98.9
Person 98.8
Person 98.6
Person 98.2
Person 96.7
Person 93.7
Clothing 88.8
Apparel 88.8
Plant 83.7
Art 79.3
Flower 78.4
Blossom 78.4
People 72.6
Flower Arrangement 67
Wedding 62.8
Drawing 62.7
Person 61.3
Person 58.3
Flower Bouquet 58.2
Funeral 57.5
Gown 55.5
Fashion 55.5
Robe 55.2
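
The Amazon tags above are object-and-scene labels with confidence scores in percent, as returned by Amazon Rekognition. A minimal sketch of how such a list can be produced with the boto3 SDK follows; the image path, region, and thresholds are placeholders, not values taken from this record.

import boto3

# Minimal sketch, assuming configured AWS credentials; "photo.jpg" and
# the region are placeholders.
client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,        # cap on the number of labels returned
        MinConfidence=55.0,  # roughly the cutoff visible in the list above
    )

# Each entry pairs a label name with a confidence in percent,
# e.g. "Person 99.2".
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')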

Clarifai
created on 2023-10-26

people 99.9
group 98.9
adult 98.1
many 97.7
outfit 96.2
group together 96
wear 95.5
administration 94.7
leader 94.5
woman 94.4
several 93.5
man 92.5
royalty 90.7
monarch 90.6
princess 87.8
veil 87.5
actress 87.2
prince 86.7
child 82.3
illustration 81.8

Imagga
created on 2022-01-23

bass 64.2
brass 34.7
wind instrument 27
musical instrument 21.2
city 16.6
drawing 15.9
silhouette 15.7
sketch 14.9
design 14.6
grunge 14.5
people 12.8
black 12.7
business 12.7
urban 12.2
building 12.2
man 11.5
symbol 11.4
cornet 11
architecture 10.9
old 10.4
sky 10.2
light 10
person 9.8
night 9.8
group 9.7
male 9.2
art 8.8
graphic 8.8
sign 8.3
vintage 8.3
paint 8.1
cemetery 8.1
couple 7.8
travel 7.7
wall 7.7
crowd 7.7
finance 7.6
pattern 7.5
technology 7.4
dirty 7.2
currency 7.2
celebration 7.2
tower 7.2
history 7.1
financial 7.1
businessman 7.1
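
The Imagga tags (note the confident mislabels such as "bass" and "brass") come from its REST tagging endpoint. A sketch using the requests library, assuming Imagga's public v2 /tags API; the credentials and image URL are placeholders, and the response field names reflect my understanding of that API rather than anything verified against this record.

import requests

# Minimal sketch of Imagga's v2 tagging endpoint. API_KEY, API_SECRET,
# and the image URL are placeholders.
API_KEY = "your_api_key"
API_SECRET = "your_api_secret"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=(API_KEY, API_SECRET),  # HTTP basic auth
)
resp.raise_for_status()

# Assumed shape: {"result": {"tags": [{"confidence": ..., "tag": {"en": ...}}]}}
for item in resp.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))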

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99
vase 73.2
black and white 64.9
posing 39.2

Face analysis

AWS Rekognition

Age 31-41
Gender Male, 97.9%
Confused 39.9%
Calm 31.4%
Sad 23.7%
Happy 1.9%
Surprised 1.3%
Fear 0.7%
Disgusted 0.7%
Angry 0.5%

AWS Rekognition

Age 33-41
Gender Male, 85.1%
Confused 83.1%
Sad 7%
Calm 5%
Surprised 2.3%
Disgusted 1.3%
Happy 0.6%
Fear 0.4%
Angry 0.4%

AWS Rekognition

Age 39-47
Gender Female, 98.9%
Sad 66.9%
Calm 9.8%
Disgusted 7.6%
Happy 6.2%
Confused 3.7%
Angry 2.3%
Fear 1.8%
Surprised 1.7%

AWS Rekognition

Age 48-54
Gender Male, 99.7%
Confused 94.7%
Sad 2.7%
Calm 0.7%
Fear 0.6%
Happy 0.4%
Surprised 0.3%
Disgusted 0.3%
Angry 0.2%

AWS Rekognition

Age 42-50
Gender Female, 71.1%
Confused 73.1%
Happy 15%
Calm 5.9%
Sad 3.9%
Angry 0.6%
Surprised 0.6%
Fear 0.5%
Disgusted 0.4%

AWS Rekognition

Age 23-33
Gender Male, 96.8%
Confused 55.5%
Calm 18.9%
Surprised 14.5%
Disgusted 3.9%
Sad 3.3%
Happy 1.6%
Angry 1.2%
Fear 1%

AWS Rekognition

Age 40-48
Gender Male, 99.1%
Calm 96.1%
Happy 2%
Fear 1.6%
Sad 0.2%
Confused 0.1%
Disgusted 0%
Angry 0%
Surprised 0%

AWS Rekognition

Age 27-37
Gender Female, 98.1%
Sad 81.1%
Happy 13.3%
Confused 1.9%
Calm 1.5%
Disgusted 0.7%
Angry 0.6%
Fear 0.5%
Surprised 0.4%
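
Each AWS Rekognition block above corresponds to one detected face, with an estimated age range, a gender guess, and confidences for eight emotion categories. A sketch of the underlying boto3 call; the image path and region are placeholders.

import boto3

# Minimal sketch: per-face age/gender/emotion estimates from Amazon
# Rekognition, assuming configured AWS credentials.
client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]   # e.g. {"Low": 31, "High": 41}
    gender = face["Gender"]  # e.g. {"Value": "Male", "Confidence": 97.9}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back unordered; sort by confidence as in this record.
    for emo in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emo["Type"].capitalize()} {emo["Confidence"]:.1f}%')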

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
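
Google Vision reports face attributes as categorical likelihoods rather than percentages, which is why every entry above reads "Very unlikely", "Unlikely", or "Possible". A sketch with the google-cloud-vision Python client; credentials and the image path are placeholders.

from google.cloud import vision

# Minimal sketch, assuming GOOGLE_APPLICATION_CREDENTIALS is set.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood is a five-step enum, rendered here the way this record
# prints it.
names = {
    vision.Likelihood.UNKNOWN: "Unknown",
    vision.Likelihood.VERY_UNLIKELY: "Very unlikely",
    vision.Likelihood.UNLIKELY: "Unlikely",
    vision.Likelihood.POSSIBLE: "Possible",
    vision.Likelihood.LIKELY: "Likely",
    vision.Likelihood.VERY_LIKELY: "Very likely",
}

for face in response.face_annotations:
    print("Surprise", names[face.surprise_likelihood])
    print("Anger", names[face.anger_likelihood])
    print("Sorrow", names[face.sorrow_likelihood])
    print("Joy", names[face.joy_likelihood])
    print("Headwear", names[face.headwear_likelihood])
    print("Blurred", names[face.blurred_likelihood])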

Feature analysis

Amazon

Person 99.2%

Text analysis

Amazon

36
9-101
ISS
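
The detected strings above ("36", "9-101", "ISS") are raw OCR hits from Amazon Rekognition's text detection, likely markings on the print or stage rather than document text. A sketch of the call; the image path and region are placeholders.

import boto3

# Minimal sketch of Rekognition OCR, assuming configured AWS credentials.
client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# Results come back as LINE and WORD detections; printing lines only
# avoids duplicates.
for det in response["TextDetections"]:
    if det["Type"] == "LINE":
        print(det["DetectedText"])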