Human Generated Data

Title

Untitled (family standing on porch)

Date

1957

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8982

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Balcony 95.8
Human 95.6
Person 95.6
Person 95.3
Person 94.6
Person 87.8
Banister 81.1
Handrail 81.1
Person 79.6
Person 76.5
Railing 69.4
Door 61
Person 60.4

Imagga
created on 2022-01-09

balcony 82.2
window 44.4
architecture 40.3
building 31
structure 29.3
house 29.2
home 24.7
wall 21.5
interior 21.2
modern 18.2
office 18.1
city 17.4
design 16.9
glass 16.3
windows 16.3
residential 15.3
dishwasher 14.9
urban 14.8
screen 14.8
sky 14.7
window screen 14.5
construction 14.5
exterior 13.8
light 13.4
estate 12.3
white goods 12.1
old 11.8
residence 11.7
roof 11.4
real 11.4
protective covering 11.3
metal 11.3
style 11.1
door 10.9
frame 10.8
new 10.5
home appliance 10.1
indoor 10
facade 9.9
travel 9.9
room 9.6
high 9.5
framework 9.3
appliance 9
railing 8.8
indoors 8.8
property 8.7
architectural 8.6
luxury 8.6
iron 8.4
wood 8.3
inside 8.3
monitor 8.2
windowsill 8
steel 7.9
business 7.9
summer 7.7
apartment 7.7
outdoor 7.6
stone 7.6
vacation 7.4
decoration 7.2
covering 7.2
landmark 7.2
transportation 7.2
holiday 7.2
decor 7.1

Google
created on 2022-01-09

Building 90.6
Rectangle 88.1
Font 77.4
Facade 77
Monochrome photography 74.4
Monochrome 73.9
Art 72.9
Metal 67.7
Room 67.2
Glass 60.3
Visual arts 59.5
Pattern 56.9
Window covering 55.8
Symmetry 55.3
Handrail 54.8
Illustration 51.9
Roof 51.9
Daylighting 51.7

Microsoft
created on 2022-01-09

black and white 90.1
window 85.1
text 78.4
person 78.2
building 73.4
clothing 52.8

Face analysis

Amazon

AWS Rekognition

Age 37-45
Gender Male, 91.6%
Calm 77.2%
Sad 14.4%
Confused 3.3%
Happy 1.8%
Surprised 1.5%
Angry 0.9%
Disgusted 0.7%
Fear 0.2%

AWS Rekognition

Age 22-30
Gender Male, 96.3%
Calm 93.5%
Disgusted 1.7%
Sad 1.5%
Happy 1.1%
Angry 0.8%
Surprised 0.8%
Fear 0.3%
Confused 0.3%

AWS Rekognition

Age 39-47
Gender Female, 62.5%
Calm 99.6%
Surprised 0.1%
Sad 0.1%
Confused 0.1%
Happy 0%
Angry 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 52-60
Gender Female, 67.5%
Calm 98%
Happy 1.1%
Sad 0.4%
Confused 0.2%
Surprised 0.1%
Disgusted 0.1%
Fear 0%
Angry 0%

AWS Rekognition

Age 23-33
Gender Female, 60.5%
Sad 51.2%
Calm 34.2%
Confused 7.8%
Fear 4.5%
Happy 0.9%
Surprised 0.8%
Disgusted 0.4%
Angry 0.3%

AWS Rekognition

Age 16-24
Gender Male, 99.7%
Calm 96.9%
Sad 1.4%
Angry 0.6%
Disgusted 0.4%
Surprised 0.3%
Happy 0.2%
Confused 0.1%
Fear 0.1%

AWS Rekognition

Age 30-40
Gender Male, 93.2%
Calm 86%
Sad 8.9%
Happy 1.9%
Disgusted 1.6%
Confused 0.9%
Surprised 0.4%
Angry 0.2%
Fear 0.1%

Feature analysis

Amazon

Person 95.6%

Captions

Microsoft

a group of people standing in front of a window 45.6%
a group of people standing next to a window 45.5%
a group of people in front of a window 45.4%

Text analysis

Amazon

42470.

Google

70
2
.
너2너 70.