Human Generated Data

Title

Untitled (woman seated on man's lap at outdoor table)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5327

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Furniture 100
Clothing 99.7
Apparel 99.7
Person 98.9
Human 98.9
Person 98.8
Person 97.4
Chair 96.9
Person 96.9
Person 95.5
Female 95.5
Shorts 95.2
Person 92.1
Face 88
Woman 86.5
Chair 84
Hat 83.3
Dress 81.1
Tie 80.5
Accessories 80.5
Accessory 80.5
Chair 79.9
Text 79.7
People 77.9
Dining Table 75.1
Table 75.1
Suit 74.3
Coat 74.3
Overcoat 74.3
Girl 71.7
Meal 71
Food 71
Photography 69.3
Photo 69.3
Portrait 68.9
Smile 66.8
Sunglasses 64.8
Person 62.8
Person 61.9
Plant 59.7
Kid 59.5
Child 59.5
Sailor Suit 56.9
Man 56.2
Outdoors 56.2
Collage 56.1
Advertisement 56.1
Poster 56.1
Shirt 55.3
Glasses 55.3

Clarifai
created on 2023-10-26

people 99.9
adult 98.7
group 98.2
group together 97.5
man 96.8
chair 96.7
woman 95.2
furniture 94.6
monochrome 94.1
wear 92.3
administration 92.1
leader 91.9
sit 90.5
several 89.1
many 87.5
two 86.5
recreation 84.6
four 80.8
military 80.7
seat 78.6

Imagga
created on 2022-01-22

chair 83.9
seat 43.2
folding chair 31.9
table 27.9
furniture 27.5
chairs 23.5
person 19.7
musical instrument 19.3
people 18.9
sitting 18.9
summer 18.6
beach 17
man 16.8
restaurant 16.8
outdoors 14.9
vacation 14.7
newspaper 14.6
business 14
room 13.8
outside 13.7
interior 13.3
outdoor 13
product 12.9
relax 12.6
relaxation 12.6
computer 12
empty 12
cafeteria 11.7
leisure 11.6
lifestyle 11.6
tropical 11.1
laptop 11.1
day 11
tables 10.8
steel drum 10.8
work 10.8
male 10.6
building 10.6
urban 10.5
sit 10.4
resort 10.3
luxury 10.3
percussion instrument 10.1
adult 10.1
wood 10
water 10
modern 9.8
sun 9.7
technology 9.6
patio 9.5
glass 9.3
holiday 9.3
floor 9.3
creation 9.2
sky 8.9
working 8.8
office 8.8
bowed stringed instrument 8.8
couple 8.7
grass 8.7
sea 8.6
cello 8.5
design 8.4
house 8.3
wind instrument 8.3
coffee 8.3
city 8.3
inside 8.3
tourism 8.2
tranquil 8.1
support 8.1
group 8.1
businessman 7.9
furnishing 7.9
device 7.9
planner 7.8
architecture 7.8
education 7.8
sunny 7.7
travel 7.7
comfortable 7.6
stringed instrument 7.6
paradise 7.5
drawing 7.5
human 7.5
ocean 7.5
style 7.4
relaxing 7.3
student 7.2
coast 7.2
sand 7.1
indoors 7

Google
created on 2022-01-22

Chair 86.3
Black-and-white 83.8
Adaptation 79.3
Vintage clothing 74.9
Snapshot 74.3
Monochrome 73.9
Art 73.9
Monochrome photography 73.7
Hat 73.4
Font 71.6
Suit 69.6
Event 67.6
Room 66.7
Sitting 64.8
History 64.6
Stock photography 63.7
Classic 62.1
Boot 61.5
Pattern 58.4
Retro style 57

Microsoft
created on 2022-01-22

text 98.9
clothing 92
person 89.4
man 75.9
footwear 64.7

Face analysis

AWS Rekognition

Age 50-58
Gender Male, 65.4%
Happy 54.4%
Sad 23.3%
Calm 8.9%
Angry 6.8%
Confused 2.1%
Surprised 1.8%
Fear 1.4%
Disgusted 1.3%

AWS Rekognition

Age 23-33
Gender Female, 88.8%
Calm 83.2%
Confused 13.2%
Surprised 1.8%
Sad 0.8%
Angry 0.4%
Disgusted 0.3%
Fear 0.1%
Happy 0.1%

AWS Rekognition

Age 48-56
Gender Male, 92.4%
Calm 99.3%
Happy 0.2%
Confused 0.2%
Sad 0.1%
Surprised 0.1%
Disgusted 0.1%
Angry 0%
Fear 0%

AWS Rekognition

Age 29-39
Gender Male, 99.3%
Happy 94.9%
Calm 2.4%
Surprised 1.3%
Confused 0.7%
Sad 0.3%
Disgusted 0.2%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 47-53
Gender Male, 73.7%
Calm 58.5%
Surprised 13.2%
Happy 9.2%
Sad 6.1%
Angry 4.4%
Confused 3.4%
Disgusted 2.8%
Fear 2.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.9%
Chair 96.9%
Hat 83.3%
Tie 80.5%
Sunglasses 64.8%

Categories

Imagga

paintings art 99.6%

Text analysis

Amazon

17547.
19549.

Google

17547.
17547.