Human Generated Data

Title

Untitled (man and woman near small man-made pond)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7565

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.5
Human 99.5
Person 95
Person 76.6
Apparel 72.6
Clothing 72.6
Building 68
Face 64.8
People 61.2

Clarifai
created on 2023-10-25

people 99.9
adult 97.9
one 96.4
furniture 95.9
man 95.1
wear 95.1
home 93.7
two 92.9
monochrome 91.4
woman 88.5
chair 87.6
vehicle 86.8
seat 86.8
sit 86
street 85.3
leader 84.6
administration 82.8
military 80.1
group 80
print 79.2

Imagga
created on 2022-01-08

chair 46.8
seat 36.7
furniture 29.5
rocking chair 26.1
architecture 23.5
building 21.7
window 20.8
house 20.1
interior 18.6
snow 17.7
modern 17.5
structure 16.5
wood 15.8
inside 15.6
room 15.5
home 15.2
city 15
wall 14.7
light 14
urban 14
shop 13.5
travel 13.4
luxury 12.9
old 12.5
barbershop 12.2
patio 11.9
stone 11.2
door 11.1
barrier 11.1
street 11
tourism 10.7
lamp 10.5
bench 10.4
scene 10.4
construction 10.3
decor 9.7
table 9.7
balcony 9.7
barber chair 9.7
indoors 9.7
metal 9.7
style 9.6
windows 9.6
apartment 9.6
residential 9.6
brick 9.4
glass 9.3
floor 9.3
outdoor 9.2
vehicle 9.1
design 9
outdoors 9
sky 8.9
trees 8.9
chairs 8.8
garden 8.7
support 8.6
furnishing 8.6
empty 8.6
season 8.6
living 8.5
3d 8.5
winter 8.5
relaxation 8.4
barrow 8.2
wheeled vehicle 8.2
weather 8
to 8
exit 7.9
mercantile establishment 7.8
hotel 7.6
relax 7.6
device 7.5
handcart 7.4
landscape 7.4
town 7.4
indoor 7.3
business 7.3
transportation 7.2
bathroom 7.1
wooden 7
hospital 7

Microsoft
created on 2022-01-08

black and white 89.6
text 87.2
building 84.9
window 84.4
furniture 22.2

Face analysis

AWS Rekognition

Age 42-50
Gender Female, 97.8%
Calm 99.6%
Sad 0.2%
Surprised 0.1%
Happy 0%
Disgusted 0%
Confused 0%
Angry 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%

Text analysis

Amazon

YACCY
the

Google

inu
inu