Human Generated Data

Title

Untitled (couple standing near tomb at Manasota Burial Park, Florida)

Date

1956

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11647

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.5
Human 99.5
Person 99.2
Shop 94.4
Nature 83.2
Person 75.3
Person 65.2
Urban 56.2

Clarifai
created on 2023-10-25

people 99.8
adult 96
home 94.4
group 94.3
man 93.4
indoors 93.2
woman 93.1
leader 92.3
many 92.2
family 91.2
group together 91
two 90.3
child 90.1
room 88.8
several 87.9
furniture 87.8
monochrome 85
three 84.7
administration 79.3
chair 78.3

Imagga
created on 2022-01-15

architecture 27.9
city 25.8
building 25.4
structure 23.7
sky 22.3
house 16.7
urban 14.9
travel 14.8
street 14.7
old 14.6
industry 14.5
office 14.1
construction 13.7
exterior 12.9
water 12.7
stall 10.9
power 10.9
landmark 10.8
night 10.7
landscape 10.4
business 10.3
industrial 10
transportation 9.9
tower 9.8
equipment 9.8
new 9.7
skyline 9.5
buildings 9.5
sea 9.4
transport 9.1
ocean 9.1
modern 9.1
facade 9
bridge 8.7
houses 8.7
light 8.7
downtown 8.6
cityscape 8.5
energy 8.4
place 8.4
town 8.3
frame 8.3
vintage 8.3
billboard 8.2
river 8
steel 8
high 7.8
factory 7.7
fuel 7.7
panorama 7.6
clouds 7.6
station 7.5
outdoors 7.5
plant 7.5
smoke 7.4
tourism 7.4
environment 7.4
metal 7.2
road 7.2
black 7.2
home 7.2
negative 7.1
container 7.1

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 98.6
wedding dress 88.3
bride 78
house 77.1
clothing 76.8
black and white 73.6
person 73.4
black 71.3
white 69.9
woman 61.8
dress 54.9
flower 52.5
posing 51.2

Color Analysis

Face analysis

AWS Rekognition

Age 23-31
Gender Male, 79.9%
Happy 81.3%
Confused 12.4%
Calm 5.3%
Surprised 0.3%
Fear 0.3%
Disgusted 0.2%
Sad 0.2%
Angry 0.1%

AWS Rekognition

Age 35-43
Gender Female, 64.8%
Calm 47.8%
Happy 20.2%
Fear 16.5%
Surprised 7.1%
Angry 3.2%
Disgusted 2.8%
Sad 1.6%
Confused 1%

AWS Rekognition

Age 23-31
Gender Male, 97.4%
Calm 93.4%
Fear 1.9%
Sad 1.9%
Surprised 1.2%
Happy 0.7%
Angry 0.3%
Confused 0.3%
Disgusted 0.3%

AWS Rekognition

Age 9-17
Gender Female, 50.5%
Calm 75.6%
Sad 21.3%
Angry 1.1%
Fear 0.5%
Happy 0.5%
Disgusted 0.4%
Confused 0.4%
Surprised 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Unlikely

Feature analysis

Amazon

Person 99.5%

Categories

Imagga

interior objects 99.4%

Text analysis

Amazon

44489