Human Generated Data

Title

Untitled (three photographs: woman at typewriter; woman with tractor toy; home economics class)

Date

c. 1940, printed later

People

Artist: Harry Annas, American, 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6796

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 97.3
Person 97.3
Monitor 97.3
Electronics 97.3
Display 97.3
Screen 97.3
LCD Screen 97.3
Person 96.9
Indoors 89.6
Interior Design 89.6
Furniture 88.8
Desk 85.3
Table 85.3
Person 83
Computer 72.3
Face 71.8
Pc 71.1
Collage 69.9
Poster 69.9
Advertisement 69.9
Person 68.7
Person 63.8
Entertainment Center 55.4
Shelf 55.1
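
The Amazon tags above are label/confidence pairs in the shape returned by AWS Rekognition's DetectLabels operation. The following is a minimal sketch of how such tags could be regenerated with boto3; the image file name is a placeholder, not part of this record, and AWS credentials are assumed to be configured locally.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("annas_untitled.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,  # the lowest score listed above is 55.1
)

for label in response["Labels"]:
    # Prints pairs like "Person 97.3" and "Desk 85.3", mirroring the list above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')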

Clarifai
created on 2019-11-16

people 99.8
group 99.8
adult 99.1
man 98.1
television 97.6
room 96.4
many 96.4
desk 95.9
woman 95.2
group together 95
furniture 94
education 92.9
several 91
indoors 90.4
screen 89
actor 88.4
three 87
technology 85.4
war 83.2
office 82.7
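
The Clarifai concepts above look like output from Clarifai's general prediction model (the API returns scores as 0-1 floats; they are shown here on a 0-100 scale). A rough sketch against the v2 REST predict endpoint follows; the API key, model id, and image URL are placeholders, and the exact endpoint and model id should be checked against current Clarifai documentation.

import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"                      # placeholder
MODEL_ID = "general-image-recognition"                 # assumed general model id
IMAGE_URL = "https://example.org/annas_untitled.jpg"   # placeholder

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Prints pairs like "people 99.8" and "group 99.8", as in the list above.
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')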

Imagga
created on 2019-11-16

desk 67.6
table 53
monitor 44.3
office 41.7
computer 40
business 37
furniture 31.1
laptop 30.6
working 29.2
people 28.4
professional 26.5
person 25.2
work 25.1
adult 24.7
businesswoman 24.5
man 24.2
businessman 23.8
corporate 23.2
executive 23
sitting 21.5
male 21.3
technology 20.8
job 20.3
confident 20
worker 19.5
happy 19.4
smiling 18.8
furnishing 18.6
meeting 17
manager 16.8
businesspeople 16.1
electronic equipment 16
education 15.6
notebook 15.5
career 15.1
web site 14.3
women 14.2
screen 14.2
looking 13.6
smile 13.5
workplace 13.3
window 13.1
portrait 12.9
equipment 12.9
student 12.7
modern 12.6
indoors 12.3
keyboard 12.2
room 12
phone 12
communication 11.8
suit 11.7
team 11.6
attractive 11.2
successful 11
indoor 11
employee 10.5
telephone 10.4
home 10.4
showing 10.3
men 10.3
paper 10.2
coffee 10.2
occupation 10.1
hand 9.9
pretty 9.8
partners 9.7
group 9.7
success 9.7
expression 9.4
two 9.3
document 9.3
teamwork 9.3
board 9
interior 8.8
expertise 8.7
collar 8.6
boss 8.6
formal 8.6
college 8.5
design 8.4
study 8.4
service 8.3
lady 8.1
information 8
lifestyle 7.9
busy 7.7
serious 7.6
device 7.5
contemporary 7.5
one 7.5
mature 7.4
cheerful 7.3
display 7.2
cup 7.2
handsome 7.1
face 7.1
center 7.1
television 7.1
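
The Imagga tag list follows the shape of Imagga's /v2/tags response. Below is a minimal sketch using HTTP basic authentication; the key, secret, and image URL are placeholders, and parameter names should be verified against the current Imagga documentation.

import requests

IMAGGA_KEY, IMAGGA_SECRET = "YOUR_KEY", "YOUR_SECRET"   # placeholders
IMAGE_URL = "https://example.org/annas_untitled.jpg"    # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    # Prints pairs like "desk 67.6" and "table 53.0", mirroring the list above.
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')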

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 99
indoor 95.6
drawing 93.4
cartoon 91.8
person 89.5
sketch 88.7
clothing 83.4
black and white 80.4
furniture 58.1
library 52.1
old 45.4
computer 31.3
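
The Microsoft tags resemble output of the Azure Computer Vision "analyze image" operation with the Tags visual feature. A rough sketch follows; the endpoint, key, API version, and image URL are placeholders to verify against the Azure documentation.

import requests

AZURE_ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
AZURE_KEY = "YOUR_KEY"                                                # placeholder
IMAGE_URL = "https://example.org/annas_untitled.jpg"                  # placeholder

resp = requests.post(
    f"{AZURE_ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
    json={"url": IMAGE_URL},
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    # Prints pairs like "text 99.0" and "indoor 95.6", as in the list above.
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')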

Color Analysis

Face analysis

AWS Rekognition

Age 19-31
Gender Female, 52.2%
Confused 45.4%
Surprised 48.4%
Sad 45.8%
Calm 49.6%
Disgusted 45%
Happy 45.1%
Fear 45.5%
Angry 45.2%

AWS Rekognition

Age 23-35
Gender Male, 52.2%
Angry 45.1%
Disgusted 45%
Fear 45%
Sad 45%
Happy 53.5%
Calm 46.2%
Confused 45%
Surprised 45.1%

AWS Rekognition

Age 31-47
Gender Male, 50.1%
Angry 50%
Sad 49.6%
Confused 49.5%
Surprised 49.5%
Happy 49.5%
Disgusted 49.7%
Calm 49.5%
Fear 49.6%

AWS Rekognition

Age 24-38
Gender Female, 50.3%
Angry 49.5%
Happy 49.5%
Calm 49.9%
Confused 49.5%
Disgusted 49.8%
Fear 49.6%
Sad 49.6%
Surprised 49.5%

AWS Rekognition

Age 23-35
Gender Female, 50.4%
Sad 49.5%
Fear 49.5%
Angry 49.6%
Surprised 49.5%
Calm 49.8%
Disgusted 49.9%
Confused 49.5%
Happy 49.6%

AWS Rekognition

Age 23-37
Gender Male, 50%
Calm 50%
Surprised 49.7%
Angry 49.7%
Confused 49.5%
Happy 49.5%
Fear 49.5%
Sad 49.5%
Disgusted 49.5%

AWS Rekognition

Age 13-23
Gender Female, 50.1%
Calm 49.7%
Disgusted 49.6%
Happy 49.5%
Fear 49.5%
Surprised 49.5%
Sad 50.1%
Confused 49.5%
Angry 49.6%

AWS Rekognition

Age 13-23
Gender Male, 50.4%
Disgusted 49.5%
Happy 49.5%
Angry 49.5%
Confused 49.5%
Calm 49.5%
Fear 49.6%
Sad 50.3%
Surprised 49.5%

AWS Rekognition

Age 50-68
Gender Female, 50.1%
Surprised 49.6%
Disgusted 49.5%
Sad 50%
Angry 49.6%
Calm 49.7%
Happy 49.5%
Fear 49.5%
Confused 49.5%

AWS Rekognition

Age 2-8
Gender Male, 50.1%
Disgusted 49.9%
Angry 49.7%
Confused 49.5%
Calm 49.6%
Sad 49.6%
Happy 49.6%
Surprised 49.5%
Fear 49.6%

AWS Rekognition

Age 23-35
Gender Female, 50.5%
Surprised 49.5%
Disgusted 49.6%
Happy 49.9%
Sad 49.5%
Calm 49.6%
Confused 49.5%
Fear 49.5%
Angry 49.8%

AWS Rekognition

Age 29-45
Gender Female, 50%
Angry 49.6%
Happy 49.7%
Disgusted 49.6%
Fear 49.6%
Confused 49.6%
Surprised 49.5%
Calm 49.8%
Sad 49.6%

AWS Rekognition

Age 23-35
Gender Female, 50.1%
Happy 49.5%
Fear 49.6%
Sad 49.7%
Confused 49.6%
Disgusted 49.5%
Angry 49.8%
Calm 49.8%
Surprised 49.5%
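
Each age-range, gender, and emotion block above has the structure of an AWS Rekognition DetectFaces result with all attributes requested. The following is a minimal boto3 sketch; the image file name is a placeholder.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("annas_untitled.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # required to get AgeRange, Gender, and Emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        # Prints lines like "Calm 49.6%" and "Sad 45.8%", as listed above.
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')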

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
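
The Google Vision block reports per-face likelihood buckets rather than percentages. A minimal sketch with the google-cloud-vision Python client follows; the file name is a placeholder and credentials are assumed to be configured.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("annas_untitled.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihood buckets such as VERY_UNLIKELY correspond to "Very unlikely" above.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)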

Feature analysis

Amazon

Person 97.3%

Categories

Imagga

paintings art 73.6%
food drinks 23.1%
text visuals 1.8%
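
The category scores ("paintings art", "food drinks", "text visuals") look like output from an Imagga categorizer. A rough sketch against the /v2/categories endpoint follows; the categorizer id "personal_photos" and the response field names are assumptions, as are the key, secret, and image URL.

import requests

IMAGGA_KEY, IMAGGA_SECRET = "YOUR_KEY", "YOUR_SECRET"   # placeholders
IMAGE_URL = "https://example.org/annas_untitled.jpg"    # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",  # assumed categorizer id
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

for category in resp.json()["result"]["categories"]:
    # Prints lines like "paintings art 73.6%", as in the list above.
    print(f'{category["name"]["en"]} {category["confidence"]:.1f}%')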

Text analysis

Google

TTY
PA
TTY PA
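
The fragments above ("TTY", "PA") are the kind of partial strings OCR returns from signage or lettering in a photograph. A minimal sketch with the google-cloud-vision client's text detection follows; the file name is a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("annas_untitled.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected text; the remaining annotations are
# individual words, e.g. "TTY" and "PA" as listed above.
for annotation in response.text_annotations:
    print(annotation.description)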