Human Generated Data

Title

Untitled (overhead view of wedding banquet tables seated with wedding party and guests)

Date

1941

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10242

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2022-01-22

Person 99.3
Human 99.3
Person 99
Person 98.5
Person 98.4
Person 98.2
Person 97.3
Person 97
Person 96.6
Person 94.1
People 93.8
Person 89.7
Crowd 89.1
Person 89.1
Person 84.6
Clothing 81.6
Apparel 81.6
Advertisement 79.6
Poster 79.6
Person 76.7
Person 73.2
Painting 69
Art 69
Person 67
Person 66.3
Person 64.1
Person 63.3
Person 62.8
Person 59
Gown 56.3
Fashion 56.3
Person 52.1
Person 41.8
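
The Amazon list above pairs each label with a confidence score. A minimal sketch of how such labels can be requested, assuming the boto3 Rekognition client, a local scan named photograph.jpg, and a low MinConfidence threshold (all assumptions, not details taken from the museum's record):

```python
import boto3

# Sketch only: not the museum's documented pipeline.
rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:   # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=40,  # assumption: low threshold, consistent with tags like "Person 41.8"
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```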

Clarifai
created on 2023-10-26

people 100
many 99.9
group 99.7
group together 98.9
adult 97.4
child 94.5
man 93.3
war 92.7
wear 92.2
administration 92.2
military 90.1
woman 88.2
leader 87.5
crowd 87.4
recreation 86.4
boy 81.9
several 77.6
music 76.6
soldier 76.2
uniform 76.1
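
Clarifai returns concept values on a 0-1 scale, which read here as percentages. A hedged sketch assuming Clarifai's public REST endpoint and its general image-recognition model; the access token, model id, and file name are placeholders, not details from this record:

```python
import base64
import requests

API_KEY = "YOUR_CLARIFAI_PAT"              # placeholder credential
MODEL_ID = "general-image-recognition"     # assumed general model id

with open("photograph.jpg", "rb") as f:    # placeholder file name
    b64 = base64.b64encode(f.read()).decode("utf-8")

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": b64}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # values come back in 0-1; the list above shows them as percentages
    print(concept["name"], round(concept["value"] * 100, 1))
```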

Imagga
created on 2022-01-22

clothing 21.5
people 21.2
person 18.5
man 15.4
world 14.9
brassiere 13.2
dancer 13.1
bride 12.6
portrait 12.3
newspaper 12.2
male 11.3
happy 11.3
performer 11.3
business 10.9
black 10.8
woman's clothing 10.6
undergarment 10.6
covering 10.6
group 10.5
happiness 10.2
garment 10.1
dress 9.9
adult 9.9
consumer goods 9.7
design 9.7
couple 9.6
celebration 9.6
love 9.5
art 9.3
wedding 9.2
old 9.1
product 9
financial 8.9
bouquet 8.8
symbol 8.7
child 8.3
vintage 8.3
fun 8.2
sexy 8
creation 8
family 8
face 7.8
entertainer 7.8
attractive 7.7
married 7.7
grunge 7.7
money 7.7
decoration 7.6
room 7.6
bill 7.6
finance 7.6
fashion 7.5
retro 7.4
blackboard 7.4
cash 7.3
currency 7.2
team 7.2
smile 7.1
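
Imagga exposes its tagger as a REST endpoint. A minimal sketch assuming the /v2/tags upload endpoint with HTTP Basic auth; the credentials and file name are placeholders:

```python
import requests

API_KEY = "YOUR_IMAGGA_KEY"        # placeholder credential
API_SECRET = "YOUR_IMAGGA_SECRET"  # placeholder credential

with open("photograph.jpg", "rb") as f:   # placeholder file name
    resp = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )
resp.raise_for_status()

for entry in resp.json()["result"]["tags"]:
    print(entry["tag"]["en"], round(entry["confidence"], 1))
```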

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 99.8
person 99.4
clothing 95.1
black 83.9
old 76.5
white 73.8
player 73.7
posing 72.8
man 72
team 61.9
woman 61
group 60.7
crowd 0.7
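
The Microsoft tags are the kind of output the Azure Computer Vision tagging operation returns. A minimal sketch assuming the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and file name are placeholders:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "YOUR_AZURE_KEY"                                             # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

with open("photograph.jpg", "rb") as f:   # placeholder file name
    result = client.tag_image_in_stream(f)

for tag in result.tags:
    # the SDK reports confidence in 0-1; the list above shows percentages
    print(tag.name, round(tag.confidence * 100, 1))
```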

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 1-7
Gender Male, 98.8%
Calm 61.9%
Disgusted 9.7%
Angry 8.6%
Sad 8.4%
Happy 8%
Confused 1.9%
Fear 1%
Surprised 0.6%

AWS Rekognition

Age 51-59
Gender Male, 100%
Angry 71.7%
Disgusted 19.2%
Sad 3.2%
Surprised 2.1%
Calm 1.6%
Fear 0.8%
Confused 0.8%
Happy 0.6%

AWS Rekognition

Age 27-37
Gender Male, 100%
Calm 90.9%
Sad 3.6%
Angry 2.1%
Confused 1%
Disgusted 0.8%
Fear 0.7%
Surprised 0.5%
Happy 0.3%

AWS Rekognition

Age 30-40
Gender Male, 88.9%
Sad 58.2%
Calm 23.9%
Fear 9.1%
Happy 4.1%
Disgusted 1.8%
Confused 1.1%
Angry 1%
Surprised 0.9%

AWS Rekognition

Age 23-31
Gender Male, 73.8%
Calm 53.8%
Sad 34.7%
Angry 5.2%
Disgusted 3%
Surprised 1.3%
Confused 1%
Fear 0.5%
Happy 0.4%

AWS Rekognition

Age 16-24
Gender Male, 99.8%
Calm 89.3%
Confused 3.5%
Sad 2.9%
Happy 1.6%
Angry 1%
Disgusted 0.7%
Surprised 0.5%
Fear 0.4%
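
Each block above gives an estimated age range, a gender guess, and emotion scores for one detected face. A minimal sketch of how such per-face attributes can be requested from Amazon Rekognition, assuming boto3 and a placeholder file name:

```python
import boto3

rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:   # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # required to get age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```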

Feature analysis

Amazon

Person 99.3%
Poster 79.6%
Painting 69%

Categories

Text analysis

Amazon

MARTIN
PROOF
MARTIN SCHWEIG
SAINT
SAINT LOUIS
LOUIS
SCHWEIG
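
The strings above are line- and word-level detections of the studio's proof stamp. A minimal sketch assuming Amazon Rekognition text detection via boto3; the file name is a placeholder:

```python
import boto3

rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:   # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    # Rekognition returns both LINE and WORD detections for the same text
    print(detection["Type"], detection["DetectedText"])
```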

Google

PROOF MARTIN SCHWEIG SAINT LOUIS
PROOF
MARTIN
SCHWEIG
SAINT
LOUIS
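
Google's list begins with the full detected block ("PROOF MARTIN SCHWEIG SAINT LOUIS") followed by the individual words, which matches how the Vision API orders its text annotations. A minimal sketch assuming the google-cloud-vision client library; the file name is a placeholder:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photograph.jpg", "rb") as f:   # placeholder file name
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

for annotation in response.text_annotations:
    # the first annotation is the full detected text, then individual words
    print(annotation.description)
```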