Human Generated Data

Title

Untitled (audience watching bride and groom during wedding ceremony)

Date

1953

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8741

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.5
Human 99.5
Person 99.3
Yard 99.3
Outdoors 99.3
Nature 99.3
Person 98.7
Vegetation 98.2
Plant 98.2
Person 98.2
Person 97.9
Person 97.6
Person 97.5
Person 96.9
Clothing 96.6
Apparel 96.6
Grove 95.6
Woodland 95.6
Land 95.6
Tree 95.6
Forest 95.6
Dress 90.8
Furniture 90.3
Grass 88.7
Person 87.8
Person 80.6
Suit 80.1
Overcoat 80.1
Coat 80.1
Female 79.2
Person 76.9
Person 75.6
Park 75.2
Lawn 75.2
Crowd 75
People 73.5
Chair 72.3
Person 71.2
Meal 69.3
Food 69.3
Face 68.5
Girl 65.5
Robe 63.6
Fashion 63.6
Woman 62.9
Gown 62.5
Leisure Activities 62.4
Backyard 60.8
Wedding 60.1
Person 59.7
Field 59.7
Standing 59.6
Wedding Gown 55

Clarifai
created on 2023-10-25

people 99.5
man 96.2
group 95
adult 94.7
woman 94.6
monochrome 91.3
many 86.5
child 86
ceremony 84.8
light 84.6
art 84.3
crowd 81.3
leader 80.1
veil 79.5
group together 79.3
wedding 79.2
street 78.6
religion 77.6
wall 77.4
wear 76.3

Imagga
created on 2022-01-09

gravestone 96.4
memorial 77.9
cemetery 76.4
stone 62.4
structure 43.5
tree 26.9
landscape 23.8
old 23.7
park 22.2
trees 21.3
forest 20.9
winter 20.4
snow 20
architecture 18.8
building 17.8
city 17.5
season 15.6
sky 15.3
travel 14.8
light 14
cold 13.8
black 13.2
road 12.6
religion 12.6
texture 12.5
frozen 12.4
tourism 12.4
outdoor 12.2
antique 12.1
church 12
grunge 11.9
fall 11.8
art 11.7
color 11.7
history 11.6
autumn 11.4
grungy 11.4
path 11.3
weather 11.3
ice 11.1
wood 10.8
scenery 10.8
night 10.7
rural 10.6
frost 10.6
scene 10.4
ancient 10.4
street 10.1
landmark 9.9
vintage 9.9
fog 9.7
tunnel 9.6
leaf 9.3
scenic 8.8
dark 8.4
morning 8.1
sun 8.1
natural 8
cool 8
day 7.8
snowy 7.8
mist 7.7
summer 7.7
wallpaper 7.7
woods 7.6
old fashioned 7.6
pattern 7.5
outdoors 7.5
retro 7.4
water 7.3
paint 7.2
fountain 7.2
seasonal 7
leaves 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

tree 98.9
grave 95.6
cemetery 93
black and white 85.3
text 85.2
funeral 77.2
person 73

Face analysis

AWS Rekognition

Age 19-27
Gender Female, 64.8%
Happy 97.2%
Calm 1.1%
Fear 0.8%
Sad 0.4%
Surprised 0.2%
Angry 0.1%
Disgusted 0.1%
Confused 0.1%

AWS Rekognition

Age 24-34
Gender Female, 55.3%
Angry 84.8%
Surprised 4.6%
Calm 4.1%
Fear 2.5%
Sad 1.6%
Confused 1%
Disgusted 0.8%
Happy 0.8%

AWS Rekognition

Age 21-29
Gender Male, 86.3%
Calm 95.3%
Sad 4.2%
Happy 0.3%
Disgusted 0.1%
Surprised 0.1%
Angry 0%
Confused 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Feature analysis

Amazon

Person 99.5%
Chair 72.3%

Text analysis

Amazon

38567
د8
KODAK

Google

58 YT37A°2- AO
58
YT37A°2-
AO