Human Generated Data

Title

Untitled (couple standing next to chapel, Manasota Florida)

Date

1956

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4606

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 99.3
Human 99.3
Person 98.8
Home Decor 89.1
Stage 70.7
Walkway 56.6
Path 56.6
Person 47.2

Imagga
created on 2022-02-05

building 41.1
architecture 38.8
house 30.4
wall 23.2
interior 22.1
home 21.5
old 20.9
structure 19.1
room 18.3
shelf 18
fireplace 16.7
exterior 16.6
window 16.3
floor 15.8
hall 15.7
travel 15.5
construction 15.4
door 15
city 15
facade 13.2
residence 13
design 12.9
stone 12.6
wood 12.5
lamp 12.4
estate 12.3
urban 12.2
modern 11.9
light 11.4
roof 11
brick 10.7
night 10.7
school 10.6
new 10.5
museum 10.5
living 10.4
stage 10.4
luxury 10.3
sky 10.2
palace 10
tourism 9.9
history 9.8
furniture 9.8
style 9.6
apartment 9.6
residential 9.6
real 9.5
contemporary 9.4
3d 9.3
street 9.2
facility 9.1
houses 8.7
entrance 8.7
scene 8.7
ancient 8.6
platform 8.4
town 8.3
traditional 8.3
university 8
tile 7.9
doors 7.9
entry 7.8
empty 7.7
culture 7.7
depository 7.6
landscape 7.4
church 7.4
vacation 7.4
inside 7.4
decoration 7.3
historic 7.3
area 7.1
rural 7
balcony 7
wooden 7
arch 7

Google
created on 2022-02-05

Plant 91.8
Building 90.1
Window 87.5
Door 83.8
Adaptation 79.3
Facade 78
Rectangle 77.8
Tints and shades 77.4
Beauty 74.9
Landscape 74.4
Snapshot 74.3
Event 69.7
Sky 69.2
Visual arts 65.7
Flower 64.5
Home 63.7
Room 63.5
Magenta 63
Wood 61
Brick 56.4

Microsoft
created on 2022-02-05

text 92.7
outdoor 86.2
building 80.6
house 79.7
sky 73.5

Face analysis

AWS Rekognition

Age 49-57
Gender Male, 98.1%
Disgusted 46%
Sad 19.8%
Happy 14.4%
Calm 10.9%
Fear 3.7%
Angry 2.6%
Surprised 1.5%
Confused 1.1%

AWS Rekognition

Age 48-54
Gender Male, 75.2%
Happy 57.5%
Calm 17.1%
Sad 13.1%
Fear 4.8%
Disgusted 3.8%
Angry 1.8%
Surprised 1.2%
Confused 0.7%

AWS Rekognition

Age 40-48
Gender Female, 52.2%
Calm 93.4%
Sad 2.1%
Happy 1.3%
Confused 1%
Disgusted 0.8%
Angry 0.6%
Fear 0.5%
Surprised 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%

Captions

Microsoft

a man standing in front of a building 87.7%
a man that is standing in front of a building 84.2%
a man standing in front of a brick building 79.2%

Text analysis

Amazon

АТОГАЯА2
АТОГАЯА2 STEMMIETS
STEMMIETS