Human Generated Data

Title

Untitled (group of men and women sitting at table under an umbrella near a pool)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5337

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.7
Human 99.7
Person 99.7
Person 98.8
Person 98.8
Person 89.7
Clothing 88.5
Apparel 88.5
Person 87.9
Person 87
Meal 82
Food 82
People 80.4
Tablecloth 80.4
Furniture 72.3
Leisure Activities 67.5
Person 65.7
Outdoors 65.5
Person 64.5
Plant 64.1
Female 63.2
Suit 62.1
Coat 62.1
Overcoat 62.1
Text 60.8
Tent 59.5
Camping 58.3
Table 58
Picnic 57.1
Vacation 57.1
Dining Table 56.8
Person 48
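
The Amazon values above are label names paired with confidence scores, expressed as percentages, of the kind returned by the AWS Rekognition DetectLabels operation. The sketch below is a minimal, hypothetical reconstruction of such a call using boto3; the file name is a placeholder and the MaxLabels/MinConfidence settings are assumptions.

    import boto3

    # Placeholder file name for a local copy of the photograph.
    with open("steinmetz_4.2002.5337.jpg", "rb") as f:
        image_bytes = f.read()

    rekognition = boto3.client("rekognition")
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,       # assumed cap
        MinConfidence=45,   # assumed floor; the lowest score shown above is 48
    )

    for label in response["Labels"]:
        # Each label has a name and a confidence in percent, e.g. "Person 99.7".
        print(label["Name"], round(label["Confidence"], 1))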

Clarifai
created on 2023-10-26

people 99.4
man 98
umbrella 97.9
adult 96.1
group 95.9
group together 94.5
chair 92.9
woman 91.6
sit 89.7
many 85.8
child 84.9
furniture 83.7
several 83.4
sunshade 83.3
two 82.6
four 82.5
leader 81.6
recreation 81.2
family 78.7
wear 76.3
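
The Clarifai concepts follow the same name/score pattern, with scores shown here as percentages. A rough sketch of how such concepts might be requested through Clarifai's public v2 REST predict endpoint; the endpoint path, model ID ("general-image-recognition"), API key, and image URL are all assumptions or placeholders rather than details taken from this record.

    import requests

    API_KEY = "YOUR_CLARIFAI_API_KEY"                      # placeholder
    MODEL_ID = "general-image-recognition"                 # assumed general-concepts model
    IMAGE_URL = "https://example.org/steinmetz_photo.jpg"  # placeholder

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    )
    response.raise_for_status()

    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        # Concept values are probabilities in [0, 1]; shown above as percentages.
        print(concept["name"], round(concept["value"] * 100, 1))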

Imagga
created on 2022-01-22

snow 50.1
chair 28.7
beach 28
sand 27.6
travel 27.5
weather 27.2
sea 25.8
sky 24.2
landscape 23.8
patio 21.4
outdoor 20.6
structure 20.5
ocean 19.9
vacation 19.6
area 18.1
water 18
tourism 17.3
seat 17
summer 16.7
winter 16.2
cold 15.5
scene 14.7
island 14.6
coast 14.4
tropical 13.6
season 13.2
resort 13.2
shore 13.1
ice 13.1
wood 11.7
rural 11.5
scenic 11.4
sun 11.3
peaceful 11
transportation 10.8
car 10.7
frozen 10.5
outdoors 10.4
forest 10.4
paradise 10.4
cloud 10.3
empty 10.3
day 10.2
clouds 10.1
relax 10.1
relaxation 10
holiday 10
chairs 9.8
snowy 9.7
frost 9.6
people 9.5
vehicle 9.5
tourist 9.1
park 9.1
desert 9
table 9
sunny 8.6
canvas tent 8.5
folding chair 8.4
field 8.4
house 8.4
vintage 8.3
seaside 8.2
road 8.1
scenery 8.1
horizon 8.1
mountain 8
grass 7.9
furniture 7.9
solitude 7.7
tree 7.7
village 7.7
bay 7.5
mountains 7.4
man 7.4
peace 7.3
morning 7.2
soil 7.1
architecture 7
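
The Imagga tags and confidences (0-100) match the shape of results from Imagga's /v2/tags endpoint, which authenticates with an API key/secret pair over HTTP basic auth. A sketch under those assumptions, with placeholder credentials and image URL:

    import requests

    API_KEY, API_SECRET = "YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET"  # placeholders
    IMAGE_URL = "https://example.org/steinmetz_photo.jpg"          # placeholder

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )
    response.raise_for_status()

    for entry in response.json()["result"]["tags"]:
        # Each entry pairs a confidence (0-100) with a tag name keyed by language.
        print(entry["tag"]["en"], round(entry["confidence"], 1))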

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 93.4
umbrella 91.5
beach 89.6
clothing 81.9
man 78.1
person 76.6
black and white 66.5
tent 64.8
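
The Microsoft tags resemble output from the Azure Computer Vision "tag" operation. A sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders, not values from this record.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder Azure resource endpoint and key.
    client = ComputerVisionClient(
        "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("YOUR_KEY"),
    )

    result = client.tag_image("https://example.org/steinmetz_photo.jpg")
    for tag in result.tags:
        # Tag confidence is a probability in [0, 1]; shown above as a percentage.
        print(tag.name, round(tag.confidence * 100, 1))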

Face analysis

AWS Rekognition

Age 21-29
Gender Male, 60.2%
Calm 99%
Sad 0.4%
Angry 0.2%
Surprised 0.1%
Disgusted 0.1%
Confused 0.1%
Fear 0.1%
Happy 0.1%

AWS Rekognition

Age 20-28
Gender Male, 78.2%
Calm 95.1%
Happy 1.8%
Confused 1.1%
Sad 1.1%
Disgusted 0.3%
Fear 0.2%
Angry 0.2%
Surprised 0.2%

AWS Rekognition

Age 25-35
Gender Female, 79.1%
Calm 99.8%
Sad 0.1%
Surprised 0%
Happy 0%
Confused 0%
Disgusted 0%
Fear 0%
Angry 0%

AWS Rekognition

Age 31-41
Gender Female, 51.5%
Happy 46.8%
Calm 37.3%
Surprised 7.7%
Sad 2.6%
Confused 2.2%
Disgusted 1.6%
Angry 1.1%
Fear 0.6%

AWS Rekognition

Age 24-34
Gender Male, 71.6%
Calm 82.5%
Happy 16.2%
Disgusted 0.3%
Sad 0.3%
Fear 0.2%
Surprised 0.2%
Confused 0.2%
Angry 0.1%

AWS Rekognition

Age 29-39
Gender Female, 84.1%
Calm 80.4%
Sad 13.7%
Happy 1.3%
Disgusted 1.3%
Fear 1%
Confused 1%
Angry 0.9%
Surprised 0.4%
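
The per-face age ranges, gender estimates, and ranked emotion scores above have the structure of AWS Rekognition DetectFaces output with all facial attributes requested. A minimal sketch, again with a placeholder file name:

    import boto3

    with open("steinmetz_4.2002.5337.jpg", "rb") as f:  # placeholder file name
        image_bytes = f.read()

    rekognition = boto3.client("rekognition")
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            # Emotion types come back upper-case (e.g. CALM); title-case to match the listing.
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")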

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely
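
The Google Vision rows report likelihood buckets (Very unlikely through Very likely) for surprise, anger, sorrow, joy, headwear, and blur, which is how the Cloud Vision face detection API expresses face attributes. A sketch using the google-cloud-vision client library; the file name is a placeholder.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz_4.2002.5337.jpg", "rb") as f:  # placeholder file name
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Likelihood fields are enums such as VERY_UNLIKELY, UNLIKELY, LIKELY.
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)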

Feature analysis

Amazon

Person 99.7%
Tent 59.5%
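
The Feature analysis entries (Person, Tent) are labels for which Rekognition also returns bounding-box instances; DetectLabels reports these under each label's Instances field. A short, self-contained sketch under that assumption:

    import boto3

    with open("steinmetz_4.2002.5337.jpg", "rb") as f:  # placeholder file name
        image_bytes = f.read()

    response = boto3.client("rekognition").detect_labels(Image={"Bytes": image_bytes})

    for label in response["Labels"]:
        for instance in label.get("Instances", []):
            # BoundingBox values are Left/Top/Width/Height as ratios of the frame.
            print(label["Name"], round(instance["Confidence"], 1), instance["BoundingBox"])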

Categories

Imagga

paintings art 99.7%
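
The single Imagga category ("paintings art") looks like output from Imagga's /v2/categories endpoint with one of its stock categorizers. The categorizer ID below ("personal_photos") and the credentials are assumptions, not details recorded here.

    import requests

    API_KEY, API_SECRET = "YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET"  # placeholders
    IMAGE_URL = "https://example.org/steinmetz_photo.jpg"          # placeholder

    response = requests.get(
        "https://api.imagga.com/v2/categories/personal_photos",  # assumed categorizer
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )
    response.raise_for_status()

    for category in response.json()["result"]["categories"]:
        print(category["name"]["en"], round(category["confidence"], 1))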

Captions

Microsoft
created on 2022-01-22

calendar 5.8%
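
The low-confidence Microsoft caption ("calendar 5.8%") is the kind of result returned by the Azure Computer Vision "describe" operation. A sketch with placeholder endpoint, key, and image URL:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://YOUR_RESOURCE.cognitiveservices.azure.com/",  # placeholder
        CognitiveServicesCredentials("YOUR_KEY"),              # placeholder
    )

    description = client.describe_image("https://example.org/steinmetz_photo.jpg")
    for caption in description.captions:
        # Caption confidence is a probability in [0, 1]; shown above as a percentage.
        print(caption.text, round(caption.confidence * 100, 1))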

Text analysis

Amazon

17528

Google

17528. 17528. HAGON-YT37A2-AMTZA3
17528.
HAGON-YT37A2-AMTZA3
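
Both text results, the "17528" read by Amazon and Google's fuller transcription with its per-line breakdown, have the shape of OCR output from AWS Rekognition DetectText and Google Cloud Vision text detection. A combined sketch, with the file name again a placeholder:

    import boto3
    from google.cloud import vision

    with open("steinmetz_4.2002.5337.jpg", "rb") as f:  # placeholder file name
        image_bytes = f.read()

    # AWS Rekognition DetectText: LINE and WORD detections with confidences.
    rekognition = boto3.client("rekognition")
    for detection in rekognition.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
        if detection["Type"] == "LINE":
            print("Amazon:", detection["DetectedText"])

    # Google Cloud Vision: the first annotation is the full text block,
    # the remaining annotations are the individual elements.
    client = vision.ImageAnnotatorClient()
    response = client.text_detection(image=vision.Image(content=image_bytes))
    for annotation in response.text_annotations:
        print("Google:", annotation.description)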