Human Generated Data

Title

Untitled (women typing)

Date

1951

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20161

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.5
Human 99.5
Person 99.4
Sunglasses 94
Accessories 94
Accessory 94
Electronics 86.7
Home Decor 84.4
LCD Screen 82.9
Monitor 82.9
Screen 82.9
Display 82.9
Table 75.6
Furniture 75.6
Pc 73.3
Computer 73.3
Desk 71.2
Interior Design 69.4
Indoors 69.4
Girl 63.7
Female 63.7
Face 63.7
Portrait 62.9
Photography 62.9
Photo 62.9
Brick 55.6
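
The Amazon tags above, with confidences between roughly 55 and 99.5, match the output of AWS Rekognition's label detection. A minimal sketch of how such tags might be produced with boto3; the filename and the MinConfidence floor are assumptions:

    import boto3

    # Assumes AWS credentials are configured; the filename is hypothetical.
    client = boto3.client("rekognition", region_name="us-east-1")

    with open("untitled_women_typing.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # assumed floor; the lowest tag above is Brick 55.6
        )

    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))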

Clarifai
created on 2023-10-22

people 98.9
desk 98.7
group 98.2
adult 97
woman 96.5
furniture 96.5
room 96.5
man 96
indoors 95.7
group together 95.4
technology 92.5
two 90.7
three 89.4
sit 88.5
employee 88
office 86.7
microphone 85.4
music 81.8
sitting 81.5
education 81.4
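
The Clarifai concepts above look like output from a general image-recognition model. A hedged sketch against Clarifai's v2 REST API; the API key, image URL, and model ID are placeholders and assumptions:

    import requests

    # Placeholder key and URL; "general-image-recognition" is the ID of Clarifai's
    # public general model, but the exact model used here is not recorded.
    API_KEY = "YOUR_CLARIFAI_KEY"
    endpoint = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"
    payload = {
        "inputs": [
            {"data": {"image": {"url": "https://example.org/untitled_women_typing.jpg"}}}
        ]
    }

    response = requests.post(
        endpoint, json=payload, headers={"Authorization": f"Key {API_KEY}"}
    ).json()

    # Concept values are 0-1; the tags above read as percentages.
    for concept in response["outputs"][0]["data"]["concepts"]:
        print(concept["name"], round(concept["value"] * 100, 1))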

Imagga
created on 2022-03-05

man 34.9
office 29.4
people 29
male 25.5
person 25.2
business 24.9
businessman 23.8
room 22.3
table 21.8
meeting 21.6
work 21.2
men 19.7
desk 18.9
nurse 18.6
adult 18.5
indoors 17.5
worker 17.1
team 17
smiling 16.6
happy 16.3
home 15.9
working 15.9
teamwork 15.7
sitting 15.4
businesswoman 15.4
kitchen 14.7
group 14.5
lifestyle 13.7
colleagues 13.6
modern 13.3
businesspeople 13.3
job 13.3
professional 13
computer 12.9
executive 12.5
talking 12.3
laptop 12
corporate 12
house 11.7
medical 11.5
together 11.4
women 11.1
conference 10.7
smile 10.7
mid adult 10.6
30s 10.6
chair 10.3
holding 9.9
restaurant 9.8
coworkers 9.8
cheerful 9.7
clinic 9.7
portrait 9.7
doctor 9.4
furniture 9.3
manager 9.3
presentation 9.3
building 9.2
education 8.6
happiness 8.6
casual 8.5
senior 8.4
mature 8.4
coat 8.3
color 8.3
student 8.3
20s 8.2
patient 8.1
suit 8.1
success 8
hospital 8
interior 8
two people 7.8
glass 7.8
employee 7.7
chemistry 7.7
four 7.7
plan 7.6
communication 7.5
horizontal 7.5
teacher 7.4
coffee 7.4
successful 7.3
indoor 7.3
new 7.3
classroom 7.2
looking 7.2
machine 7.1
science 7.1
face 7.1
shop 7
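
Imagga's tags (and the "interior objects" category further down) come from its REST API. A sketch using the /v2/tags endpoint with HTTP basic auth; the key/secret pair and image URL are placeholders:

    import requests

    # Imagga authenticates with the API key/secret pair as HTTP basic auth.
    KEY, SECRET = "YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/untitled_women_typing.jpg"},
        auth=(KEY, SECRET),
    ).json()

    for item in response["result"]["tags"]:
        print(item["tag"]["en"], round(item["confidence"], 1))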

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

window 99.9
computer 98
laptop 94.3
indoor 93.5
person 82.8
text 74.8
desk 70
black and white 62.8
clothing 60.5
office 55.6
table 28
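
The Microsoft tags match Azure Computer Vision's image tagging. A sketch against the v3.2 Analyze REST endpoint; the resource endpoint, key, and image URL are placeholders:

    import requests

    # Placeholder endpoint and key for an Azure Computer Vision resource.
    ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
    KEY = "YOUR_AZURE_KEY"

    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": "https://example.org/untitled_women_typing.jpg"},
    ).json()

    # Confidences are 0-1; the list above reads as percentages.
    for tag in response["tags"]:
        print(tag["name"], round(tag["confidence"] * 100, 1))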

Color Analysis

Face analysis

AWS Rekognition

Age 24-34
Gender Female, 99.8%
Happy 87.5%
Calm 5.2%
Disgusted 1.7%
Sad 1.7%
Confused 1.2%
Angry 1.2%
Surprised 0.8%
Fear 0.6%

AWS Rekognition

Age 22-30
Gender Female, 99.8%
Calm 93.6%
Sad 2.7%
Surprised 1.4%
Fear 0.9%
Angry 0.5%
Happy 0.5%
Disgusted 0.2%
Confused 0.1%
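
The two blocks above, one per detected face, carry an age range, a gender call, and ranked emotions, which is the shape of Rekognition's DetectFaces response when all attributes are requested. A minimal sketch; the filename is hypothetical:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("untitled_women_typing.jpg", "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # needed for age, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        gender = face["Gender"]
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")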

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
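
The Google Vision blocks report per-face likelihoods on Google's qualitative scale (VERY_UNLIKELY through VERY_LIKELY). A sketch with the google-cloud-vision client library, assuming application credentials are configured; the filename is hypothetical:

    from google.cloud import vision

    # Assumes GOOGLE_APPLICATION_CREDENTIALS points at a service-account key.
    client = vision.ImageAnnotatorClient()

    with open("untitled_women_typing.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)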

Feature analysis

Amazon

Person 99.5%
Person 99.4%
Sunglasses 94%

Categories

Imagga

interior objects 98.9%

Text analysis

Amazon

18
KODAK-SLA

Google

YT37A2-XA
YT37A2-XA
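
The strings above (likely film edge markings, such as a frame number and a Kodak stock code) are OCR hits. A sketch of Rekognition's DetectText call, which yields such fragments; the filename is hypothetical:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("untitled_women_typing.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})

    # LINE detections correspond to whole fragments like "KODAK-SLA";
    # WORD detections break them down further.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"], round(detection["Confidence"], 1))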