Human Generated Data

Title

Untitled (three women sitting under awning of canopy)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7581

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.6
Human 99.6
Person 97.8
Chair 97.6
Furniture 97.6
Nature 92.5
Person 89.4
Outdoors 87.4
Meal 85.2
Food 85.2
Shelter 82.7
Building 82.7
Rural 82.7
Countryside 82.7
Yard 81.6
Tent 79.9
Restaurant 78.9
Cafe 78.4
Clothing 71.4
Apparel 71.4
Patio 70.8
Table 70.1
Flooring 67.2
People 65.9
Cafeteria 62.1
Dining Table 59.8
Porch 56.9
Female 55
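
The Amazon tags above are the sort of output Rekognition's label-detection call returns: a flat list of label names with 0-100 confidence scores. A minimal sketch using boto3, assuming configured AWS credentials; the filename below is hypothetical:

```python
# Minimal sketch of Amazon Rekognition label detection (boto3).
# Assumes AWS credentials are configured; "39853a.jpg" is a hypothetical filename.
import boto3

client = boto3.client("rekognition")

with open("39853a.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # the lowest score recorded above is "Female 55"
)

# Each label carries a name and a confidence score, matching the
# "Person 99.6", "Tent 79.9" style of the entries above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```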

Clarifai
created on 2023-10-25

tent 99.6
people 99.6
furniture 97.2
monochrome 96.6
chair 95.9
group 94.1
street 93.2
man 93.1
family 91.9
group together 91.7
adult 91.6
campsite 91.4
music 89.4
camp 89.2
table 87.5
art 83.5
war 82
home 81
many 80.4
woman 80

Imagga
created on 2022-01-08

piano 45.4
grand piano 43.2
percussion instrument 39.4
keyboard instrument 35.9
stringed instrument 35
musical instrument 30.4
stall 27.1
chair 26.1
building 24.4
restaurant 18.5
sun 18.5
silhouette 18.2
sky 16.6
structure 15.9
upright 13.8
sunset 13.5
house 13.4
patio 12.8
chairs 12.7
architecture 12.6
table 12.5
city 12.5
shopping cart 12.5
people 12.3
urban 12.2
seat 12.2
business 12.1
modern 11.9
landscape 11.2
furniture 10.9
transportation 10.8
man 10.7
umbrella 10.7
outdoor 10.7
travel 10.6
summer 10.3
industry 10.2
relax 10.1
water 10
handcart 9.7
equipment 9.4
sunrise 9.4
light 9.4
beach 9.3
island 9.2
container 8.9
interior 8.8
gas 8.7
sea 8.6
glass 8.6
window 8.5
wheeled vehicle 8.4
street 8.3
vacation 8.2
home 8
day 7.8
pump 7.8
empty 7.7
tree 7.7
clouds 7.6
wood 7.5
evening 7.5
inside 7.4
person 7.3
transport 7.3
black 7.2
work 7.2
sunlight 7.1
grass 7.1
night 7.1
steel 7.1
working 7.1
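
Imagga serves its tags over a plain REST endpoint rather than an SDK. A hedged sketch with requests, following Imagga's published v2 response shape; the API key, secret, and image URL are placeholders:

```python
# Sketch of Imagga's v2 tagging endpoint over plain HTTP.
# IMAGGA_KEY, IMAGGA_SECRET, and the image URL are placeholders (assumptions).
import requests

IMAGGA_KEY = "your_api_key"
IMAGGA_SECRET = "your_api_secret"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/39853a.jpg"},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
response.raise_for_status()

# Imagga nests each tag's text under a language key ("en"),
# with confidence as a 0-100 float, e.g. "piano 45.4" above.
for entry in response.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')
```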

Microsoft
created on 2022-01-08

text 99.3
outdoor 96.1
black and white 96
furniture 88
table 82.1
house 80.4
black 79.8
white 78.8
chair 71.1
monochrome 62.2
old 40.8
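
The Microsoft tags correspond to Azure Computer Vision's image-tagging operation. A sketch with the azure-cognitiveservices-vision-computervision client; the endpoint, key, and image URL are placeholders:

```python
# Sketch of Azure Computer Vision image tagging.
# Endpoint, key, and image URL are placeholders (assumptions).
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("your_key"),
)

result = client.tag_image("https://example.org/39853a.jpg")

# The SDK reports confidence as a 0-1 float; the list above appears
# to scale it to 0-100 (e.g. "text 99.3" for 0.993).
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```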

Face analysis

AWS Rekognition

Age 23-31
Gender Male, 92.5%
Calm 97.7%
Sad 0.6%
Confused 0.6%
Surprised 0.5%
Angry 0.2%
Disgusted 0.2%
Fear 0.2%
Happy 0.1%

AWS Rekognition

Age 23-33
Gender Male, 83.8%
Calm 81.1%
Happy 15.6%
Sad 1.4%
Confused 0.6%
Surprised 0.5%
Angry 0.3%
Disgusted 0.3%
Fear 0.1%

AWS Rekognition

Age 23-31
Gender Male, 88.6%
Calm 99.1%
Sad 0.5%
Happy 0.1%
Fear 0.1%
Confused 0.1%
Disgusted 0.1%
Surprised 0%
Angry 0%
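
The age ranges, gender guesses, and emotion percentages above match the shape of Rekognition's face-detection output when full attributes are requested. A sketch, with the same hedges as above on credentials and the hypothetical filename:

```python
# Sketch of Amazon Rekognition face analysis (boto3).
# Requesting Attributes=["ALL"] returns age range, gender, and emotions.
import boto3

client = boto3.client("rekognition")

with open("39853a.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]    # e.g. {"Low": 23, "High": 31}
    gender = face["Gender"]   # e.g. {"Value": "Male", "Confidence": 92.5}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions arrive unsorted; sort descending to match the lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```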

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
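
The Google Vision rows report likelihood buckets ("Very unlikely" through "Very likely") rather than numeric scores. A sketch with the google-cloud-vision client, assuming configured credentials; the filename is again hypothetical:

```python
# Sketch of Google Cloud Vision face detection.
# Likelihoods are enum buckets (VERY_UNLIKELY..VERY_LIKELY), not percentages.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("39853a.jpg", "rb") as f:  # hypothetical filename
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

# Each detected face carries the six likelihood fields listed above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```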

Feature analysis

Amazon

Person 99.6%
Tent 79.9%

Text analysis

Amazon

39853-A
SAA

Google

39853-A の
39853-A
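
The "Text analysis" strings ("39853-A", "SAA") are OCR hits, presumably a negative or file number visible in the frame; the stray "の" in Google's output looks like OCR noise. A sketch of the Rekognition side (Google's equivalent is text_detection on the same Vision client shown above):

```python
# Sketch of Amazon Rekognition text detection (boto3).
import boto3

client = boto3.client("rekognition")

with open("39853a.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# LINE detections correspond to strings like "39853-A" above;
# each line is also broken into individual WORD detections.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```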