Human Generated Data

Title

Untitled (seated woman opening gift next to two standing women)

Date

1958

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4648

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.7
Human 99.7
Person 99.4
Person 94.9
Person 92
Clothing 88
Apparel 88
Chair 70
Furniture 70
People 66.3
Female 66.1
Building 65.9
Shorts 63.2
Person 63.2
Girl 59.8

Clarifai
created on 2023-10-15

people 99.2
monochrome 98.3
chair 97
sit 95.6
man 95.6
adult 95
sitting 92.3
woman 90.4
group 85.6
furniture 81.7
nostalgia 78.1
three 77.1
room 76.8
indoors 76.6
group together 74.4
two 73.7
child 72.6
black and white 71.7
street 66.9
administration 66

Imagga
created on 2021-12-14

man 30.2
musical instrument 28.9
male 27.6
percussion instrument 26.6
person 24.2
people 23.4
business 23.1
table 22.8
work 20.6
chair 19.7
office 19.3
sitting 18.9
businessman 18.5
adult 18.2
professional 17.9
happy 16.9
job 16.8
day 16.5
laptop 16.4
smiling 15.9
stringed instrument 15.7
men 15.4
room 14.1
indoors 14
smile 13.5
vibraphone 13.2
lifestyle 13
group 12.9
looking 12.8
women 12.6
teacher 12.5
marimba 12.3
education 12.1
corporate 12
worker 11.9
product 11.6
interior 11.5
couple 11.3
mature 11.1
casual 11
portrait 11
team 10.7
modern 10.5
businesspeople 10.4
meeting 10.4
teamwork 10.2
furniture 10.1
confident 10
executive 10
holding 9.9
computer 9.7
technology 9.6
desk 9.6
standing 9.6
device 9.3
equipment 9
grand piano 9
cheerful 8.9
working 8.8
medical 8.8
newspaper 8.8
home 8.8
architecture 8.6
hospital 8.6
glass 8.6
construction 8.5
piano 8.5
doctor 8.5
building 8.4
patient 8.4
old 8.4
house 8.4
color 8.3
indoor 8.2
businesswoman 8.2
hall 8.1
suit 8.1
keyboard instrument 8
together 7.9
happiness 7.8
casual clothing 7.8
40s 7.8
colleagues 7.8
chemistry 7.7
attractive 7.7
two 7.6
life 7.6
manager 7.4
creation 7.4
board 7.4
student 7.4
classroom 7.4
inside 7.4
builder 7.3
occupation 7.3
20s 7.3
restaurant 7.3
success 7.2
engineer 7.1
handsome 7.1
drawing 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

Face analysis

Amazon

AWS Rekognition

Age 27-43
Gender Male, 58.9%
Calm 83.1%
Happy 10.6%
Angry 2.9%
Sad 2.3%
Confused 0.6%
Disgusted 0.2%
Surprised 0.2%
Fear 0%

AWS Rekognition

Age 17-29
Gender Female, 58.2%
Calm 67.4%
Sad 17.8%
Happy 9.3%
Confused 2.4%
Angry 1.1%
Surprised 1%
Fear 0.6%
Disgusted 0.3%

AWS Rekognition

Age 38-56
Gender Male, 55.6%
Happy 49.4%
Sad 27.5%
Calm 17.3%
Confused 3.3%
Fear 1.2%
Angry 0.7%
Surprised 0.4%
Disgusted 0.2%

AWS Rekognition

Age 16-28
Gender Female, 69.6%
Calm 61.1%
Happy 33.9%
Angry 2.6%
Sad 1.7%
Confused 0.2%
Surprised 0.2%
Disgusted 0.2%
Fear 0.2%

AWS Rekognition

Age 35-51
Gender Male, 71.9%
Calm 82.7%
Happy 10.1%
Sad 5%
Angry 1%
Fear 0.5%
Confused 0.2%
Disgusted 0.2%
Surprised 0.2%

AWS Rekognition

Age 13-25
Gender Female, 68%
Happy 45.6%
Calm 28.5%
Sad 14.1%
Confused 5%
Fear 3.4%
Angry 1.5%
Surprised 1.2%
Disgusted 0.6%

AWS Rekognition

Age 14-26
Gender Male, 66.5%
Calm 88%
Happy 7.4%
Sad 3.4%
Fear 0.3%
Angry 0.3%
Confused 0.2%
Disgusted 0.2%
Surprised 0.1%

Feature analysis

Amazon

Person 99.7%
Chair 70%

Categories

Imagga

paintings art 99.8%

Text analysis

Amazon

WORLD
WAR
VETERANS
ITALIAN
25205
VESS
ITALIAN WORLD WAR VETERANS No.1
TTALL
1m18 TTALL
No.1
1m18

Google

AL TIALL HALIAN WORLD WAR VETERANS NO1 2 S 2 ) S
HALIAN
2
S
AL
TIALL
WORLD
WAR
VETERANS
NO1
)