Human Generated Data

Title

Untitled (man and nurse)

Date

c. 1930

People

Artist: Lewis Wickes Hine, American, 1874–1940

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.806

Machine Generated Data

Tags

Each service below lists its predicted labels with a confidence score on a 0-100 scale.

Amazon
created on 2021-12-14

Human 98.8
Person 98.8
Person 98.1
Electronics 86
Screen 86
Monitor 84.6
Display 84.6
Canvas 78.9
Clothing 76.7
Apparel 76.7
Face 75.2
Outdoors 64.3
Photography 61.5
Photo 61.5
Advertisement 60.8
LCD Screen 59.8
Collage 57
Poster 57
Shorts 56.8
Kid 56.3
Child 56.3
Furniture 55.1
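
The Amazon labels above are the kind of output returned by Amazon Rekognition's DetectLabels operation. The sketch below shows how a similar label list could be regenerated with boto3; the image filename is hypothetical and configured AWS credentials are assumed.

```python
# Sketch: regenerating Amazon-style label tags with Rekognition DetectLabels.
# The file path is hypothetical; AWS credentials are assumed to be configured.
import boto3


def detect_labels(image_path: str, min_confidence: float = 55.0):
    """Return (label, confidence) pairs like the Amazon tag list above."""
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=min_confidence,
        )
    return [(label["Name"], round(label["Confidence"], 1))
            for label in response["Labels"]]


if __name__ == "__main__":
    for name, conf in detect_labels("untitled_man_and_nurse.jpg"):
        print(f"{name} {conf}")
```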

Clarifai
created on 2023-10-25

people 99.5
guitar 98.8
two 98.7
child 98
portrait 97
boy 96.8
monochrome 96.6
son 96.3
collage 95.7
man 95.4
girl 95.2
art 94.6
family 93.7
wear 93.5
baby 91.2
couple 89.8
adult 89.6
recreation 89.5
music 89.1
documentary 89
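
The Clarifai concepts above can, in principle, be reproduced against Clarifai's v2 REST predict endpoint. The sketch below is an assumption-laden example: the API key, model identifier, and file path are placeholders, and the response layout follows Clarifai's documented v2 format.

```python
# Sketch: requesting concept predictions from Clarifai's v2 predict endpoint.
# API key, model ID, and file path are placeholders (assumptions).
import base64

import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"        # hypothetical credential
MODEL_ID = "general-image-recognition"   # assumed ID of Clarifai's general model


def clarifai_concepts(image_path: str):
    """Return (concept, confidence-percent) pairs like the Clarifai tag list above."""
    with open(image_path, "rb") as f:
        payload = {
            "inputs": [
                {"data": {"image": {"base64": base64.b64encode(f.read()).decode()}}}
            ]
        }
    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json=payload,
        timeout=30,
    )
    resp.raise_for_status()
    concepts = resp.json()["outputs"][0]["data"]["concepts"]
    return [(c["name"], round(c["value"] * 100, 1)) for c in concepts]
```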

Imagga
created on 2021-12-14

laptop 41.9
computer 37.1
person 34.6
people 27.9
work 26.2
business 26.1
television 26
adult 25.3
technology 24.5
notebook 22.8
man 21.5
sitting 20.6
working 20.3
office 19.3
happy 18.2
telecommunication system 17.3
chair 16.8
mountain tent 16.6
male 16.3
tent 16.2
outdoors 15.6
businesswoman 15.4
professional 15.3
lifestyle 15.2
communication 15.1
attractive 14.7
smiling 14.5
job 14.1
success 13.7
worker 13.4
steel drum 13
table 13
portrait 12.9
executive 12.9
corporate 12.9
back 12.8
casual 12.7
businessman 12.4
desk 12.3
outdoor 12.2
percussion instrument 12.1
education 12.1
successful 11.9
using 11.5
structure 11.4
smile 11.4
pretty 11.2
women 11.1
folding chair 11
scholar 10.7
one 10.4
sit 10.4
manager 10.2
day 10.2
shelter 9.9
monitor 9.6
musical instrument 9.6
wireless 9.5
men 9.4
face 9.2
modern 9.1
student 9.1
active 9
looking 8.8
indoors 8.8
typing 8.8
couple 8.7
happiness 8.6
intellectual 8.6
relax 8.4
billboard 8.4
alone 8.2
teenager 8.2
cheerful 8.1
suit 8.1
seat 7.9
hair 7.9
wheeled vehicle 7.9
love 7.9
equipment 7.9
color 7.8
model 7.8
summer 7.7
hand 7.6
horizontal 7.5
joy 7.5
keyboard 7.5
park 7.4
room 7.3
confident 7.3
sky 7
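
Imagga serves tags like these from its /v2/tags endpoint, authenticated with an API key/secret pair. The sketch below assumes the documented response layout; the credentials and image URL are placeholders.

```python
# Sketch: requesting tags from Imagga's /v2/tags endpoint for an image URL.
# Key/secret and the image URL are placeholders; the response is assumed to
# follow Imagga's documented {"result": {"tags": [...]}} layout.
import requests

IMAGGA_KEY = "YOUR_IMAGGA_API_KEY"       # hypothetical credential
IMAGGA_SECRET = "YOUR_IMAGGA_API_SECRET"  # hypothetical credential


def imagga_tags(image_url: str):
    """Return (tag, confidence) pairs like the Imagga tag list above."""
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
        timeout=30,
    )
    resp.raise_for_status()
    return [(t["tag"]["en"], round(t["confidence"], 1))
            for t in resp.json()["result"]["tags"]]
```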

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

wall 96.8
person 95
man 94.4
gallery 92
clothing 86.1
text 83.2
black and white 75.4
tree 73.9
old 50.6
room 50.3
picture frame 44.6
painting 18.6
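
The Microsoft tags above resemble output from Azure Computer Vision's tagging operation. A hedged sketch against the v3.2 REST /tag endpoint follows; the resource endpoint, key, and file path are placeholders.

```python
# Sketch: fetching image tags from Azure Computer Vision's /vision/v3.2/tag
# operation. Endpoint, key, and file path are placeholders (assumptions).
import requests

AZURE_ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # hypothetical
AZURE_KEY = "YOUR_AZURE_KEY"                                          # hypothetical


def azure_tags(image_path: str):
    """Return (tag, confidence-percent) pairs like the Microsoft tag list above."""
    with open(image_path, "rb") as f:
        resp = requests.post(
            f"{AZURE_ENDPOINT}/vision/v3.2/tag",
            headers={
                "Ocp-Apim-Subscription-Key": AZURE_KEY,
                "Content-Type": "application/octet-stream",
            },
            data=f.read(),
            timeout=30,
        )
    resp.raise_for_status()
    return [(t["name"], round(t["confidence"] * 100, 1))
            for t in resp.json()["tags"]]
```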

Color Analysis

Face analysis

AWS Rekognition

Age 33-49
Gender Female, 67%
Calm 95.9%
Sad 1.6%
Confused 1.1%
Angry 0.7%
Surprised 0.5%
Fear 0.1%
Happy 0.1%
Disgusted 0.1%

AWS Rekognition

Age 36-54
Gender Male, 97.4%
Calm 72.8%
Sad 22%
Confused 1.7%
Angry 1.5%
Surprised 0.7%
Disgusted 0.7%
Fear 0.4%
Happy 0.3%
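
Both face records above carry Rekognition-style fields (age range, gender, ranked emotions). A minimal boto3 sketch of a DetectFaces call that yields such estimates is below; the file path is hypothetical and configured AWS credentials are assumed.

```python
# Sketch: reproducing age/gender/emotion estimates with Rekognition DetectFaces
# (Attributes=["ALL"]). The file path is hypothetical.
import boto3


def analyze_faces(image_path: str):
    """Print per-face age range, gender, and emotion percentages."""
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```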

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
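
The likelihood ratings above match Cloud Vision face-detection annotations. The sketch below uses the google-cloud-vision Python client and assumes a recent client version where vision.Image and vision.Likelihood are exposed at the package level; the file path is hypothetical.

```python
# Sketch: reading Cloud Vision face-detection likelihoods (surprise, anger,
# sorrow, joy, headwear, blur). Assumes GOOGLE_APPLICATION_CREDENTIALS is set.
from google.cloud import vision


def vision_face_likelihoods(image_path: str):
    """Print likelihood ratings like the Google Vision blocks above."""
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    for face in response.face_annotations:
        fields = [
            ("Surprise", face.surprise_likelihood),
            ("Anger", face.anger_likelihood),
            ("Sorrow", face.sorrow_likelihood),
            ("Joy", face.joy_likelihood),
            ("Headwear", face.headwear_likelihood),
            ("Blurred", face.blurred_likelihood),
        ]
        for label, value in fields:
            rating = vision.Likelihood(value).name.replace("_", " ").capitalize()
            print(f"{label} {rating}")
```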

Feature analysis

Amazon

Person 98.8%

Categories

Imagga

paintings art 96.8%
pets animals 3.2%

Captions

Microsoft
created on 2021-12-14

a painting of a man 88.3%
an old photo of a man 85.4%
a painting of a man in a room 85.3%
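
The ranked captions above resemble output from Azure Computer Vision's describe operation. As with the tag sketch earlier, the endpoint, key, and file path below are placeholders, and the v3.2 REST /describe route is assumed.

```python
# Sketch: requesting ranked captions from Azure Computer Vision's
# /vision/v3.2/describe operation. Endpoint, key, and file path are placeholders.
import requests

AZURE_ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # hypothetical
AZURE_KEY = "YOUR_AZURE_KEY"                                          # hypothetical


def describe_image(image_path: str, max_candidates: int = 3):
    """Return (caption, confidence-percent) pairs like the caption list above."""
    with open(image_path, "rb") as f:
        resp = requests.post(
            f"{AZURE_ENDPOINT}/vision/v3.2/describe",
            params={"maxCandidates": max_candidates},
            headers={
                "Ocp-Apim-Subscription-Key": AZURE_KEY,
                "Content-Type": "application/octet-stream",
            },
            data=f.read(),
            timeout=30,
        )
    resp.raise_for_status()
    captions = resp.json()["description"]["captions"]
    return [(c["text"], round(c["confidence"] * 100, 1)) for c in captions]
```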