Human Generated Data

Title

Untitled (overhead view of wedding party and guests sitting in square table arrangement with plants in center)

Date

1942

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10154

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.3
Human 99.3
Furniture 97.1
Plant 95.4
Table 87.7
Home Decor 85.5
Dining Table 84.8
Person 84.7
Person 83.1
Person 81.9
Indoors 81
Tree 80.9
Tabletop 79
Room 78
Person 77.6
Flower Arrangement 77.5
Flower 77.5
Blossom 77.5
Reception 73.5
Meal 72.6
Food 72.6
Linen 70.7
Painting 68.4
Art 68.4
Person 67.9
Crowd 66.7
Flower Bouquet 59.7
Restaurant 59.6
Tablecloth 58.7
Waiting Room 58.6
Reception Room 58.6
Flooring 58.1
Hall 56.2
Couch 55.9
Person 54.3
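
The Amazon tags above are label/confidence pairs of the kind returned by Amazon Rekognition's DetectLabels operation. The following is a minimal sketch of how such tags could be requested with boto3; the local file name and the MaxLabels/MinConfidence settings are illustrative assumptions, not values recorded with this entry.

    # Minimal sketch: requesting image labels from Amazon Rekognition via boto3.
    # The file name and the MaxLabels/MinConfidence values are assumptions for
    # illustration only.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("4.2002.10154.jpg", "rb") as f:  # hypothetical local copy of the image
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,        # assumed cap on returned labels
        MinConfidence=50.0,  # assumed confidence floor, in percent
    )

    # Print "Label confidence" pairs, mirroring the layout of the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")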

Clarifai
created on 2023-10-26

people 99.7
many 99.2
group 99.2
monochrome 98.9
sepia 97.5
street 94
man 92.6
group together 92
adult 91.3
vintage 91.1
war 91.1
no person 88.9
soldier 88.5
retro 87.9
art 87.7
administration 87.1
old 86.8
child 86.6
military 86.3
woman 86.1

Imagga
created on 2022-01-22

building 23.6
structure 23.1
architecture 22.7
old 22.3
grunge 22.1
jigsaw puzzle 21.1
art 19.9
texture 17.4
design 16.9
puzzle 16.7
travel 16.2
retro 15.6
decoration 14.8
city 14.1
vintage 14.1
tourism 13.2
flowers 13
pattern 13
antique 12.6
plant 12.6
rough 11.8
graphic 11.7
tree 11.6
game 11.4
snow 11.4
frame 11.3
floral 11.1
paint 10.9
flower 10.8
surface 10.6
urban 10.5
landscape 10.4
house 10.3
summer 10.3
holiday 10
water 10
aged 10
dirty 9.9
landmark 9.9
modern 9.8
grungy 9.5
season 9.3
space 9.3
town 9.3
church 9.2
wall 9.2
material 8.9
style 8.9
messy 8.7
paper 8.7
ancient 8.6
buildings 8.5
drawing 8.5
plants 8.3
traditional 8.3
border 8.1
digital 8.1
detail 8
light 8
black 7.8
scene 7.8
wallpaper 7.7
blackboard 7.7
cityscape 7.6
memorial 7.5
exterior 7.4
vacation 7.4
brown 7.4
weather 7.3
tourist 7.2
religion 7.2
history 7.2
chandelier 7.1
trees 7.1
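
The Imagga tags above could be produced by an image-tagging request along the following lines. This sketch assumes Imagga's public v2 REST tagging endpoint; the endpoint path, parameters, response shape, and credentials are assumptions for illustration, not details recorded with this entry.

    # Hedged sketch: requesting tags from Imagga's v2 REST API with requests.
    # Endpoint, parameters, and response shape are assumptions based on Imagga's
    # public documentation; credentials and the image URL are placeholders.
    import requests

    IMAGGA_KEY = "your_api_key"        # placeholder
    IMAGGA_SECRET = "your_api_secret"  # placeholder
    IMAGE_URL = "https://example.org/4.2002.10154.jpg"  # hypothetical image URL

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
        timeout=30,
    )
    response.raise_for_status()

    # Print "tag confidence" pairs, mirroring the list above.
    for tag in response.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")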

Google
created on 2022-01-22

Photograph 94.2
Tree 79.2
Snapshot 74.3
Plant 72.8
Event 71
Room 67.5
Art 67.4
Font 67.2
History 66.6
Stock photography 65.4
Rectangle 62.5
Paper product 59.2
Urban design 58.5
Visual arts 55.8
Photo caption 55.7
Crowd 55.6
Pole 53.9
Funeral 51.8
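
The Google tags above resemble label-detection output from the Cloud Vision API. A minimal sketch with the google-cloud-vision Python client follows; the file name is a placeholder, and Vision's 0-1 scores are scaled to percentages only to mirror the list above.

    # Minimal sketch: label detection with the Google Cloud Vision client.
    # The file name is a placeholder; scores are returned in [0, 1] and scaled
    # here only to match the formatting of the list above.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("4.2002.10154.jpg", "rb") as f:  # hypothetical local copy of the image
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)

    for annotation in response.label_annotations:
        print(f"{annotation.description} {annotation.score * 100:.1f}")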

Microsoft
created on 2022-01-22

text 99.9
funeral 91.1
flower 77.3
grave 73.9
cemetery 71.7

Face analysis

Amazon

AWS Rekognition

Age 22-30
Gender Male, 100%
Calm 99.9%
Angry 0%
Sad 0%
Fear 0%
Surprised 0%
Disgusted 0%
Happy 0%
Confused 0%

AWS Rekognition

Age 20-28
Gender Female, 99.7%
Calm 81.2%
Happy 10.8%
Disgusted 5.6%
Angry 0.7%
Surprised 0.6%
Confused 0.4%
Sad 0.4%
Fear 0.3%

AWS Rekognition

Age 6-16
Gender Male, 99.7%
Calm 87.3%
Angry 7.9%
Fear 1.5%
Confused 0.9%
Sad 0.8%
Disgusted 0.6%
Happy 0.6%
Surprised 0.3%

AWS Rekognition

Age 14-22
Gender Male, 71.3%
Calm 48.6%
Angry 24.3%
Sad 13.1%
Surprised 5.3%
Disgusted 4.6%
Confused 2.3%
Fear 1.1%
Happy 0.7%

AWS Rekognition

Age 14-22
Gender Male, 99.7%
Calm 98.9%
Sad 0.4%
Fear 0.2%
Angry 0.1%
Surprised 0.1%
Disgusted 0.1%
Confused 0.1%
Happy 0.1%

AWS Rekognition

Age 28-38
Gender Male, 100%
Calm 82.6%
Confused 6.7%
Angry 3.2%
Sad 2%
Disgusted 1.8%
Surprised 1.3%
Happy 1.2%
Fear 1.1%

AWS Rekognition

Age 40-48
Gender Male, 99.1%
Sad 91.3%
Disgusted 2.3%
Calm 2.2%
Confused 1.2%
Fear 1%
Angry 0.8%
Happy 0.7%
Surprised 0.5%

AWS Rekognition

Age 13-21
Gender Female, 86.2%
Sad 38.4%
Disgusted 33.4%
Calm 18.2%
Angry 6.2%
Fear 2.1%
Confused 0.9%
Happy 0.5%
Surprised 0.3%

AWS Rekognition

Age 10-18
Gender Male, 99.1%
Calm 69.9%
Sad 10%
Fear 8.4%
Angry 4%
Happy 3.5%
Disgusted 1.5%
Surprised 1.4%
Confused 1.3%

AWS Rekognition

Age 23-31
Gender Male, 92.8%
Calm 49.3%
Happy 16.8%
Surprised 11.8%
Disgusted 9.4%
Fear 6.3%
Confused 2.5%
Angry 2.2%
Sad 1.7%

AWS Rekognition

Age 16-22
Gender Male, 75.5%
Calm 82.4%
Happy 7.2%
Surprised 6.3%
Fear 2.6%
Disgusted 0.4%
Angry 0.4%
Sad 0.4%
Confused 0.3%

AWS Rekognition

Age 14-22
Gender Female, 80.3%
Calm 99.8%
Happy 0.1%
Sad 0%
Fear 0%
Disgusted 0%
Angry 0%
Confused 0%
Surprised 0%

AWS Rekognition

Age 13-21
Gender Female, 68.1%
Calm 98.1%
Sad 0.6%
Angry 0.5%
Happy 0.3%
Disgusted 0.2%
Fear 0.2%
Confused 0.2%
Surprised 0.1%

AWS Rekognition

Age 6-16
Gender Male, 93.8%
Angry 48.3%
Calm 33.9%
Sad 8.9%
Disgusted 3%
Happy 2.6%
Surprised 1.2%
Fear 1.1%
Confused 1%
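
Each AWS Rekognition block above corresponds to one detected face, with an estimated age range, gender, and per-emotion confidence scores. A minimal boto3 sketch of such a request using the DetectFaces operation with full attributes enabled is shown below; the local file name is a placeholder.

    # Minimal sketch: face analysis with Amazon Rekognition's DetectFaces.
    # Attributes=["ALL"] requests age range, gender, and emotion estimates;
    # the file name is a placeholder.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("4.2002.10154.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions arrive unsorted; sort by confidence to match the blocks above.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")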

Feature analysis

Amazon

Person 99.3%
Painting 68.4%

Categories

Imagga

paintings art 99.9%

Captions

Microsoft
created on 2022-01-22

an old photo of a person 60%
an old photo of a cake 39%
old photo of a person 38.9%

Text analysis

Amazon

PROOF
SAINT
SAINT LOUIS
LOUIS
MARTIN
MARTIN SCHWEIG
SCHWEIG
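
The Amazon text readings above mix full lines ("MARTIN SCHWEIG", "SAINT LOUIS") with their component words, which is consistent with Amazon Rekognition's DetectText operation returning both LINE and WORD detections. A minimal boto3 sketch follows; the file name is a placeholder.

    # Minimal sketch: text detection with Amazon Rekognition's DetectText.
    # The operation returns both LINE and WORD detections, which is why phrases
    # and their individual words both appear in the list above.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("4.2002.10154.jpg", "rb") as f:  # hypothetical local copy of the image
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"], f"{detection['Confidence']:.1f}")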

Google

PROOF MARTIN SCHWEIG SAINT LOUIS
PROOF
MARTIN
SCHWEIG
SAINT
LOUIS