Human Generated Data

Title

Untitled (bride, groom and wedding guests on lawn)

Date

1953

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8746

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 98.9
Human 98.9
Person 96
Person 95.7
Person 95.4
Person 95
Person 91.4
Person 91.1
Person 90.4
Person 89.9
Person 88.6
Person 88.5
Person 87.4
Person 87.3
Person 86.7
Plant 84.8
Person 82.9
Person 79
Clothing 78
Apparel 78
Person 76.6
Road 76
Pedestrian 75.7
Outdoors 72.5
Flower 69.9
Petal 69.9
Blossom 69.9
Nature 69.1
Tree 64.7
Person 63.7
Art 62.2
Coat 60.6
Overcoat 60.6
Grass 59.7
City 59.4
Town 59.4
Building 59.4
Street 59.4
Urban 59.4
Asphalt 58.5
Tarmac 58.5
Crowd 57.9
Suit 57.3
Person 55.9
Path 55.4
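
The Amazon rows above are flat (label, confidence) pairs with many repeats of the same label. A minimal sketch, assuming Python, of collapsing such rows to the highest confidence per label and filtering by a threshold; the sample rows are a small subset copied from the list above, and the 80.0 threshold is an arbitrary choice for illustration:

```python
# Sample (label, confidence) rows copied from the Amazon tag list above.
rows = [
    ("Person", 98.9), ("Human", 98.9), ("Person", 96.0),
    ("Plant", 84.8), ("Clothing", 78.0), ("Tree", 64.7),
    ("Person", 55.9),
]

# Keep the highest confidence seen for each label.
best = {}
for label, conf in rows:
    best[label] = max(best.get(label, 0.0), conf)

# Retain only labels at or above an (arbitrary) 80.0 confidence threshold.
confident = {label: conf for label, conf in best.items() if conf >= 80.0}
```

Collapsing duplicates first means the threshold is applied once per label rather than once per detection.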

Imagga
created on 2022-01-09

snow 32.1
winter 19.6
old 18.8
city 18.3
wall 17.5
weather 16.6
landscape 16.4
black 15.6
texture 15.3
building 15
tree 14.8
grunge 14.5
forest 13
cold 12.9
structure 12.8
light 12.7
water 12.7
travel 12.7
ice 12.1
scene 12.1
pattern 11.6
park 11.5
color 11.1
picket fence 11.1
art 11.1
vintage 10.7
stone 10.4
architecture 10.3
fence 10.3
device 10
rough 10
dirty 9.9
surface 9.7
textured 9.6
design 9.6
day 9.4
season 9.3
outdoors 9
material 9
crystal 9
sky 8.9
urban 8.7
frost 8.6
grungy 8.5
house 8.3
wood 8.3
paint 8.1
barrier 8.1
wet 8
rural 7.9
space 7.8
fountain 7.7
outdoor 7.6
woods 7.6
frozen 7.6
town 7.4
exterior 7.4
street 7.4
aged 7.2
border 7.2
river 7.1
trees 7.1
cool 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

black and white 96.7
person 87.9
text 87.5
clothing 72.4
funeral 71.4
monochrome 69.2
man 68
grave 56
street 53.6
people 52.4

Face analysis

Amazon

AWS Rekognition

Age 20-28
Gender Female, 63.3%
Angry 32.1%
Sad 21.6%
Calm 15%
Happy 10.9%
Confused 6.5%
Surprised 6.1%
Disgusted 5.4%
Fear 2.3%

AWS Rekognition

Age 33-41
Gender Male, 94%
Calm 40.7%
Happy 35%
Disgusted 11%
Sad 5.4%
Angry 3.9%
Confused 1.7%
Surprised 1.2%
Fear 1%

AWS Rekognition

Age 19-27
Gender Female, 52.7%
Calm 92.4%
Surprised 3.3%
Fear 2.1%
Happy 1.1%
Sad 0.4%
Angry 0.3%
Disgusted 0.3%
Confused 0.2%

AWS Rekognition

Age 23-31
Gender Male, 99.8%
Calm 36.4%
Happy 33%
Fear 15.1%
Sad 10.3%
Angry 1.8%
Disgusted 1.5%
Surprised 1%
Confused 0.9%

AWS Rekognition

Age 23-31
Gender Male, 98.8%
Calm 94.7%
Disgusted 1.8%
Sad 1.5%
Happy 0.8%
Confused 0.7%
Angry 0.2%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 28-38
Gender Female, 56.4%
Happy 58.3%
Calm 28.6%
Confused 3.9%
Disgusted 2.5%
Surprised 2.2%
Sad 2.1%
Angry 1.3%
Fear 1%
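
Each Rekognition block above gives a full emotion distribution per face. A minimal sketch, assuming Python, of reducing such a distribution to its dominant emotion; the two sample faces use truncated values copied from the first and third blocks above:

```python
# Truncated emotion distributions (percent) for two of the faces above.
faces = [
    {"Angry": 32.1, "Sad": 21.6, "Calm": 15.0, "Happy": 10.9},
    {"Calm": 92.4, "Surprised": 3.3, "Fear": 2.1, "Happy": 1.1},
]

# Dominant emotion = the key with the largest score in each distribution.
dominant = [max(face, key=face.get) for face in faces]
```

Note that a dominant score of 32.1% (the first face) is a far weaker signal than 92.4% (the third face), so the raw distribution is worth keeping alongside the argmax.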

Feature analysis

Amazon

Person 98.9%

Captions

Microsoft

a group of people standing in front of a window 79.4%
a group of people in front of a window 79%
a group of people standing next to a window 78.9%

Text analysis

Amazon

38596
٢8
YJA-NAX

Google

58 38596
38596
58