Human Generated Data

Title

Untitled (group of students posed reading yearbooks and working at typewriters on long table)

Date

1950

People

Artist: Orrion Barger, American, active 1913-1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6620

Machine Generated Data

Tags (each tag is followed by the service's confidence score, in percent)

Amazon
created on 2019-03-25

Person 99.6
Human 99.6
Person 99.5
Person 98.9
Classroom 98.8
School 98.8
Room 98.8
Indoors 98.8
Person 98.3
Person 98
Person 96.7
Person 96.2
Person 95.8
Person 93.8
Person 88.4
Furniture 86
Chair 85.5
Person 83.6
Person 82.4
Table 78.3
Clinic 74.3
Person 71
Shoe 70.5
Apparel 70.5
Footwear 70.5
Clothing 70.5
Person 65.3
People 60.2
Lab 58
Workshop 56.8
Desk 55.4
Person 54.6
Person 44.2
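
The Amazon tags above are label-detection output from AWS Rekognition. A minimal sketch of how such a list could be produced with boto3, assuming configured AWS credentials; "photo.jpg" is a hypothetical stand-in for a scan of this photograph:

    import boto3

    client = boto3.client("rekognition")

    # Send the image bytes directly; an S3 object reference would also work.
    with open("photo.jpg", "rb") as image_file:
        response = client.detect_labels(
            Image={"Bytes": image_file.read()},
            MaxLabels=50,
        )

    # Each label carries a percentage confidence, matching entries
    # such as "Person 99.6" above.
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))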

Clarifai
created on 2019-03-25

people 99.9
group 99.7
group together 99.3
adult 97.6
administration 97.1
many 96.8
leader 96.2
education 95.5
teacher 94.7
man 94.7
several 93.8
woman 92.8
chair 91.7
meeting 91.2
furniture 90
child 86
league 85.9
room 85.6
classroom 84.8
desk 80.9
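
The Clarifai tags come from its general-concept model. A minimal sketch using the v2 Python client that was current when these tags were generated (2019); the API key and file name are hypothetical placeholders, and newer Clarifai SDKs expose a different, gRPC-based interface:

    from clarifai.rest import ClarifaiApp

    app = ClarifaiApp(api_key="YOUR_CLARIFAI_API_KEY")  # hypothetical key
    model = app.public_models.general_model

    response = model.predict_by_filename("photo.jpg")

    # Concept values are on a 0-1 scale; scaling by 100 gives numbers
    # comparable to the list above.
    for concept in response["outputs"][0]["data"]["concepts"]:
        print(concept["name"], round(concept["value"] * 100, 1))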

Imagga
created on 2019-03-25

room 48.7
classroom 47.1
musical instrument 41.2
marimba 39.6
percussion instrument 37.2
male 34
people 32.9
man 31.6
teacher 31.3
table 29.7
person 29.4
businessman 23.8
adult 23.4
business 23.1
indoors 22.8
men 21.5
women 21.3
meeting 20.7
office 20.3
smiling 20.2
group 20.1
sitting 19.8
work 19.6
interior 19.5
brass 19.4
chair 18.9
home 18.3
professional 17.9
communication 16.8
businesswoman 16.4
corporate 16.3
happy 16.3
team 16.1
wind instrument 15.9
together 15.8
education 15.6
modern 15.4
teamwork 14.8
executive 14.3
businesspeople 14.2
holding 14
student 14
indoor 13.7
desk 13.4
talking 13.3
educator 13.2
worker 12.4
school 12.3
learning 12.2
boy 12.2
mature 12.1
happiness 11.7
portrait 11.6
lifestyle 11.6
job 11.5
drinking 11.5
cheerful 11.4
couple 11.3
study 11.2
restaurant 10.9
child 10.9
cornet 10.8
conference 10.7
teaching 10.7
confident 10
kid 9.7
working 9.7
colleagues 9.7
workplace 9.5
friends 9.4
glass 9.3
smile 9.3
laptop 9.2
house 9.2
blackboard 9.1
board 9
suit 9
handsome 8.9
family 8.9
senior 8.4
friendship 8.4
drink 8.3
hall 8.3
coffee 8.3
inside 8.3
successful 8.2
kitchen 8
looking 8
to 8
diverse 7.8
standing 7.8
students 7.8
party 7.7
class 7.7
30s 7.7
two 7.6
dining 7.6
enjoying 7.6
eating 7.6
togetherness 7.5
presentation 7.4
document 7.4
book 7.3
design 7.3
children 7.3
new 7.3
color 7.2
love 7.1
employee 7
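
Imagga exposes tagging as a plain REST endpoint. A minimal sketch with requests; the key, secret, and image URL are hypothetical placeholders:

    import requests

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/photo.jpg"},
        auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),
    )

    # Imagga reports confidence on a 0-100 scale, as listed above.
    for tag in response.json()["result"]["tags"]:
        print(tag["tag"]["en"], round(tag["confidence"], 1))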

Google
created on 2019-03-25

Photograph 95.5
Room 74.4
Table 71.7
Team 67.7
Photography 62.4
Black-and-white 56.4
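
The Google tags correspond to Cloud Vision label detection. A minimal sketch with the google-cloud-vision client, assuming GOOGLE_APPLICATION_CREDENTIALS points at a service-account key; "photo.jpg" is hypothetical:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as image_file:
        # In client versions before 2.0 this type lived at vision.types.Image.
        image = vision.Image(content=image_file.read())

    response = client.label_detection(image=image)

    # Scores are on a 0-1 scale; scaled to percent for comparison.
    for label in response.label_annotations:
        print(label.description, round(label.score * 100, 1))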

Microsoft
created on 2019-03-25

person 98
group 65.2
old 42
school 23.4
ballet 18.6
library 18.3
black and white 11.9
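
The Microsoft tags match the Azure Computer Vision analyze endpoint (v2.0 was current in 2019). A minimal sketch with requests; the endpoint region and subscription key are hypothetical placeholders:

    import requests

    endpoint = "https://westus.api.cognitive.microsoft.com"

    with open("photo.jpg", "rb") as image_file:
        response = requests.post(
            endpoint + "/vision/v2.0/analyze",
            params={"visualFeatures": "Tags"},
            headers={
                "Ocp-Apim-Subscription-Key": "YOUR_AZURE_KEY",
                "Content-Type": "application/octet-stream",
            },
            data=image_file.read(),
        )

    # Azure reports tag confidences on a 0-1 scale.
    for tag in response.json()["tags"]:
        print(tag["name"], round(tag["confidence"] * 100, 1))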

Face analysis (one block per detected face: estimated age range, gender, and per-emotion confidence scores)

Amazon

AWS Rekognition

Age 23-38
Gender Female, 51.4%
Happy 45.3%
Disgusted 45.2%
Calm 53.2%
Surprised 45.6%
Confused 45.2%
Angry 45.3%
Sad 45.3%

AWS Rekognition

Age 26-44
Gender Male, 51.9%
Happy 46.7%
Sad 47.9%
Angry 45.8%
Disgusted 45.6%
Surprised 45.5%
Calm 47.9%
Confused 45.6%

AWS Rekognition

Age 26-43
Gender Female, 50.9%
Confused 45.6%
Sad 47.5%
Calm 49.6%
Happy 45.4%
Surprised 45.4%
Angry 45.8%
Disgusted 45.7%

AWS Rekognition

Age 57-77
Gender Male, 50.2%
Disgusted 45%
Surprised 45.2%
Confused 45.1%
Angry 45.5%
Sad 47.4%
Calm 51.8%
Happy 45.1%

AWS Rekognition

Age 26-44
Gender Male, 54.3%
Sad 45.8%
Confused 45.1%
Angry 45.1%
Surprised 45.1%
Calm 53.9%
Happy 45.1%
Disgusted 45%

AWS Rekognition

Age 23-38
Gender Female, 50.4%
Angry 45.6%
Happy 45.7%
Sad 49.1%
Confused 45.5%
Disgusted 45.1%
Calm 48.5%
Surprised 45.4%

AWS Rekognition

Age 26-44
Gender Female, 52.5%
Calm 46.2%
Disgusted 45.1%
Confused 45.4%
Happy 51.4%
Surprised 45.3%
Angry 45.4%
Sad 46.2%

AWS Rekognition

Age 26-43
Gender Male, 54.5%
Angry 45.5%
Happy 45.2%
Sad 46.9%
Confused 45.3%
Disgusted 45.2%
Surprised 45.3%
Calm 51.5%

AWS Rekognition

Age 26-43
Gender Female, 51%
Confused 45.3%
Angry 47%
Surprised 45.5%
Happy 45.3%
Calm 51.3%
Disgusted 45.2%
Sad 45.4%

AWS Rekognition

Age 35-52
Gender Female, 50.9%
Sad 45.6%
Angry 45.2%
Calm 48.6%
Surprised 45.1%
Happy 50.3%
Confused 45.1%
Disgusted 45.1%

AWS Rekognition

Age 26-43
Gender Female, 50.3%
Surprised 45.5%
Sad 45.3%
Disgusted 50.9%
Happy 45.6%
Confused 45.3%
Calm 46.8%
Angry 45.6%

AWS Rekognition

Age 38-57
Gender Female, 51.5%
Surprised 45.1%
Angry 45.1%
Sad 45.4%
Calm 54.1%
Happy 45.1%
Confused 45.1%
Disgusted 45.1%

AWS Rekognition

Age 35-55
Gender Female, 51%
Surprised 45.8%
Happy 45.2%
Calm 51.6%
Sad 46.7%
Angry 45.4%
Disgusted 45.1%
Confused 45.2%

AWS Rekognition

Age 26-43
Gender Female, 53.2%
Disgusted 45.3%
Calm 49.5%
Sad 48.5%
Confused 45.3%
Happy 45.2%
Surprised 45.6%
Angry 45.6%
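
These per-face blocks have the shape of Rekognition's detect_faces output when all attributes are requested. A minimal sketch, again assuming configured AWS credentials and a hypothetical "photo.jpg":

    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as image_file:
        response = client.detect_faces(
            Image={"Bytes": image_file.read()},
            Attributes=["ALL"],
        )

    # Each FaceDetail carries an estimated age range, a gender guess,
    # and a confidence score for every candidate emotion.
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print("Gender", face["Gender"]["Value"],
              round(face["Gender"]["Confidence"], 1))
        for emotion in face["Emotions"]:
            print(emotion["Type"].capitalize(),
                  round(emotion["Confidence"], 1))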

Feature analysis

Amazon

Person 99.6%
Chair 85.5%
Shoe 70.5%
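
These entries plausibly correspond to the labels Rekognition localizes to bounding boxes, i.e. those whose detect_labels results include Instances. A minimal, self-contained sketch under that assumption:

    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as image_file:
        response = client.detect_labels(Image={"Bytes": image_file.read()})

    # Only labels with Instances are tied to specific image regions.
    for label in response["Labels"]:
        for instance in label["Instances"]:
            print(label["Name"],
                  round(instance["Confidence"], 1),
                  instance["BoundingBox"])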

Text analysis

Amazon

YT33A2-
23
-XAON
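
Fragments like these are raw Rekognition text detections; reversed or low-contrast lettering in archival scans often comes back garbled. A minimal sketch with detect_text, under the same assumptions as the earlier Rekognition calls:

    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as image_file:
        response = client.detect_text(Image={"Bytes": image_file.read()})

    # LINE-level detections give text strings like those shown above.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"],
                  round(detection["Confidence"], 1))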