Human Generated Data

Title

Untitled (group of people seated on porch steps)

Date

1949

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10594

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.7
Human 99.7
Person 99.5
Person 99.5
Person 99.3
Handrail 99
Banister 99
Person 98.6
Person 97.9
Person 97.3
Person 96.2
Person 92
Porch 91.4
Clothing 89.3
Apparel 89.3
Housing 81.6
Building 81.6
Staircase 80.4
Railing 79.5
Shorts 79.4
Villa 75.7
House 75.7
Kid 73.1
Child 73.1
Dress 73.1
Plant 66.2
Girl 62.6
Female 62.6
People 61.9
Architecture 60.1
Door 58.6
Chair 55.7
Furniture 55.7
Yard 55.4
Nature 55.4
Outdoors 55.4
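
The Amazon tags above pair a label name with a confidence score on a 0-100 scale, the shape of output produced by Amazon Rekognition label detection. Below is a minimal sketch of how tags like these might be generated with boto3; the local file name, region, and thresholds are assumptions for illustration, not part of this record.

```python
# Minimal sketch: producing label tags like the ones above with Amazon Rekognition.
# The file path, region, and thresholds are assumptions for illustration only.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_porch_steps.jpg", "rb") as f:  # hypothetical local copy of the image
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,          # cap on returned labels
        MinConfidence=55.0,    # drop labels scoring below ~55, matching the lowest tags above
    )

# Each label carries a name and a confidence percentage, e.g. "Person 99.7".
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```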

Clarifai
created on 2023-10-25

people 100
group 99.5
child 99.4
education 98.7
group together 98.7
man 98
many 97.7
school 97.5
adult 97.5
elementary school 95.6
boy 95.5
administration 95.4
woman 94.7
leader 94.2
several 93.9
step 93.6
five 89.2
teacher 88.5
home 88.4
sit 87.1

Imagga
created on 2022-01-09

people 20.1
building 19.7
city 18.3
silhouette 17.4
architecture 16.6
urban 16.6
window 16.3
business 15.2
office 14.8
man 14.4
room 14.2
interior 14.1
station 13.8
glass 12.7
classroom 12.4
light 12
black 12
travel 12
chair 11.6
structure 11.5
person 11.4
women 11.1
transportation 10.8
male 10.6
modern 10.5
facility 10.4
hall 10.2
house 10.1
walk 9.5
men 9.4
water 9.3
gymnasium 9.3
adult 9.3
reflection 9.2
transport 9.1
indoor 9.1
corridor 8.8
airport 8.8
crowd 8.6
motion 8.6
construction 8.5
wall 8.5
walking 8.5
floor 8.4
inside 8.3
tourism 8.2
group 8.1
life 8
case 8
businessman 7.9
shop 7.9
human 7.5
athletic facility 7.5
new 7.3
steel 7.2
activity 7.2
portrait 7.1
night 7.1
day 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 91.4
window 89
house 65.3
posing 57.9

Color Analysis

Face analysis

AWS Rekognition

Age 23-31
Gender Male, 77.1%
Calm 92.8%
Confused 3.6%
Sad 1.1%
Fear 1%
Surprised 0.6%
Happy 0.4%
Disgusted 0.3%
Angry 0.2%

AWS Rekognition

Age 38-46
Gender Male, 99.4%
Calm 73.4%
Surprised 24.7%
Sad 0.5%
Happy 0.5%
Disgusted 0.3%
Angry 0.3%
Confused 0.2%
Fear 0.2%
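
Each face record above (estimated age range, gender with confidence, and a ranked emotion breakdown) follows the shape of Amazon Rekognition face detection output. A minimal sketch, again assuming a hypothetical local copy of the photograph:

```python
# Minimal sketch: face attributes like the age range, gender, and emotion
# percentages above, via Amazon Rekognition. The file path is an assumption.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_porch_steps.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, emotions, etc., not just bounding boxes
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]      # e.g. {"Low": 23, "High": 31}
    gender = face["Gender"]     # e.g. {"Value": "Male", "Confidence": 77.1}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back as a list; sort to mirror the highest-first listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```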

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible
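
The Google Vision face entries report likelihood buckets (Very unlikely through Very likely) rather than percentages. A minimal sketch of reading such annotations with the google-cloud-vision client; the file name is an assumption:

```python
# Minimal sketch: Google Cloud Vision face annotations, which report likelihood
# buckets (VERY_UNLIKELY ... VERY_LIKELY) instead of confidence percentages.
# The file path is an assumption for illustration.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_porch_steps.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each field is a Likelihood enum; .name yields strings like "VERY_UNLIKELY".
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```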

Feature analysis

Amazon

Person 99.7%

Categories

Text analysis

Amazon

11

Google

11
11
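
The text analysis rows record the string "11" found in the photograph by OCR. Google Vision's text detection returns the full detected text first, followed by each individual element, which would explain the value appearing twice. A minimal sketch, with the file name again assumed:

```python
# Minimal sketch: OCR on the photograph with Google Cloud Vision text detection.
# The first annotation is the full detected text; the rest are individual
# elements, which is consistent with "11" appearing twice above.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_porch_steps.jpg", "rb") as f:  # hypothetical local copy of the image
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

for annotation in response.text_annotations:
    print(annotation.description)
```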