Human Generated Data

Title

Untitled (newlywed couple leaving house as woman throws rice on them)

Date

1930-1945

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10239

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Clothing 99.2
Apparel 99.2
Person 99.1
Human 99.1
Person 98.4
Person 94.8
Door 88.7
Person 83.7
Overcoat 77.9
Coat 77.9
Fashion 73.3
Robe 72.9
Gown 72.5
Evening Dress 72.5
Plant 70.8
Suit 68.6
Face 64
Tuxedo 59.1
Flower 55.5
Blossom 55.5
Wedding 55.2

Clarifai
created on 2023-10-26

people 99.9
woman 97.5
two 96.4
family 95.8
group 95.6
child 94.8
man 93.4
three 93
adult 92.4
monochrome 92.3
portrait 90.7
street 90.5
wedding 89.5
elderly 89.4
four 88.8
wear 88.4
administration 85.6
leader 85.6
group together 80.6
offspring 79

Imagga
created on 2022-01-22

man 27.5
male 25
people 23.4
person 21.6
couple 20.9
groom 20.2
bow tie 19.7
clothing 18.9
portrait 18.1
adult 17.5
black 15.8
necktie 15.6
love 13.4
suit 13.4
military uniform 13.3
happy 13.2
business 12.1
world 12.1
bride 11.8
holding 11.6
fashion 11.3
men 11.2
uniform 11.1
two 11
garment 10.7
outdoors 10.4
women 10.3
happiness 10.2
family 9.8
businessman 9.7
style 9.6
life 9.6
home 9.6
standing 9.6
walking 9.5
smile 9.3
wedding 9.2
silhouette 9.1
attractive 9.1
old 9.1
professional 8.9
together 8.8
work 8.6
future 8.4
vintage 8.3
park 8.2
romance 8
day 7.8
corporate 7.7
youth 7.7
walk 7.6
human 7.5
office 7.5
alone 7.3
lady 7.3
new 7.3
danger 7.3
dress 7.2
looking 7.2
body 7.2
romantic 7.1
job 7.1

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 99.6
clothing 95.2
person 89.3
dress 85.8
old 83.9
woman 79.8
standing 79.3
black 75.9
posing 73.3
white 64
man 52
vintage 37.3

Face analysis

AWS Rekognition

Age 36-44
Gender Female, 99.4%
Happy 99.9%
Surprised 0.1%
Fear 0%
Angry 0%
Calm 0%
Sad 0%
Disgusted 0%
Confused 0%

AWS Rekognition

Age 35-43
Gender Female, 100%
Calm 98%
Angry 0.8%
Sad 0.4%
Disgusted 0.3%
Fear 0.2%
Surprised 0.2%
Confused 0.1%
Happy 0.1%

AWS Rekognition

Age 27-37
Gender Female, 99.5%
Happy 97.8%
Calm 1.1%
Angry 0.2%
Sad 0.2%
Confused 0.2%
Disgusted 0.2%
Surprised 0.2%
Fear 0.2%

AWS Rekognition

Age 22-30
Gender Female, 87.5%
Calm 54.6%
Surprised 13.1%
Sad 10%
Fear 9%
Disgusted 4.8%
Angry 3.5%
Happy 2.6%
Confused 2.2%

Microsoft Cognitive Services

Age 54
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%

Text analysis

Amazon

PROOF
SAINT
MARTIN
SAINT LOUIS
LOUIS
MARTIN SCHWEIG
SCHWEIG

Google

PROOF MARTIN SCHWEIG SAINT LOUIS
PROOF
MARTIN
SCHWEIG
SAINT
LOUIS