Human Generated Data

Title

Untitled (two photographs: three young women working at typewriters as two others watch; three high school students doing German exercises at blackboard)

Date

c. 1950-1960, printed later

People

Artist: Claseman Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11154

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Poster 100
Collage 100
Advertisement 100
Human 99.3
Person 99.3
Person 98.8
Person 97.9
Person 95.6
Person 91.6
Person 90.8
Person 89.5
Person 88.4
Person 80.3
Coat 63.4
Apparel 63.4
Clothing 63.4
Overcoat 63.4
Suit 63.4
Furniture 58.4
Table 58.4
Tabletop 56.7

Clarifai
created on 2019-11-16

people 99.9
group 99.3
adult 99.2
man 98
vehicle 97.2
furniture 97
administration 96.4
group together 95.8
many 95.5
room 95.1
woman 93.8
war 93.6
one 92.8
wear 91.7
leader 91.3
two 90.9
indoors 90.2
several 90.1
outfit 88.9
military 88.5

Imagga
created on 2019-11-16

business 22.5
money 20.4
cassette tape 19.5
container 19.4
paper 18.9
currency 18.8
finance 17.7
cash 16.5
magnetic tape 15.9
bill 15.2
financial 15.1
device 14.9
old 14.6
dollar 13
note 12.9
wealth 12.6
memory device 12.3
case 12.2
banking 11.9
card 11.9
vintage 11.8
bank 11.6
black 11.4
cassette 11.3
book 11.2
retro 10.6
exchange 10.5
design 10.3
equipment 10.1
office 9.7
success 9.6
savings 9.3
investment 9.2
frame 9.1
message 9.1
people 8.9
working 8.8
hundred 8.7
dollars 8.7
antique 8.6
window 8.3
pattern 8.2
technology 8.2
object 8.1
close 8
bills 7.8
banknote 7.8
debt 7.7
grunge 7.7
table 7.7
economy 7.4
color 7.2
open 7.2
art 7.2
market 7.1
person 7.1
interior 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

wall 97.4
text 95.7
black and white 93.3
indoor 91.8
person 89.9
clothing 88.6
gallery 63.6
cartoon 61
clock 59.1

Face analysis

Amazon

AWS Rekognition

Age 32-48
Gender Male, 54.7%
Angry 45.1%
Sad 45.3%
Calm 53.8%
Disgusted 45.1%
Confused 45.2%
Surprised 45.5%
Fear 45%
Happy 45%

AWS Rekognition

Age 13-23
Gender Female, 54.9%
Fear 45.1%
Angry 45.2%
Calm 53.5%
Happy 45%
Sad 46%
Disgusted 45%
Confused 45.1%
Surprised 45.1%

AWS Rekognition

Age 20-32
Gender Male, 50.1%
Angry 54.6%
Sad 45.3%
Calm 45.1%
Fear 45%
Happy 45%
Surprised 45%
Disgusted 45%
Confused 45%

AWS Rekognition

Age 11-21
Gender Male, 50.8%
Surprised 45%
Sad 54.9%
Calm 45.1%
Happy 45%
Confused 45%
Disgusted 45%
Fear 45%
Angry 45%

AWS Rekognition

Age 11-21
Gender Female, 54.7%
Calm 47.9%
Happy 45.1%
Disgusted 45.1%
Surprised 45.3%
Angry 46.7%
Sad 46.6%
Fear 45.3%
Confused 48.1%

AWS Rekognition

Age 5-15
Gender Female, 53.1%
Angry 45%
Disgusted 45%
Happy 45%
Calm 45.2%
Sad 54.7%
Surprised 45%
Fear 45%
Confused 45%

AWS Rekognition

Age 29-45
Gender Male, 50.4%
Angry 49.5%
Sad 49.5%
Confused 49.5%
Calm 50.4%
Surprised 49.5%
Happy 49.5%
Disgusted 49.6%
Fear 49.5%

AWS Rekognition

Age 26-40
Gender Male, 50.5%
Angry 49.5%
Disgusted 49.5%
Surprised 49.5%
Sad 49.7%
Confused 49.5%
Calm 50.2%
Fear 49.5%
Happy 49.5%

AWS Rekognition

Age 21-33
Gender Male, 50.4%
Surprised 49.5%
Angry 49.5%
Sad 49.5%
Fear 49.5%
Disgusted 49.5%
Happy 49.5%
Calm 50.5%
Confused 49.5%

Feature analysis

Amazon

Person 99.3%

Categories

Imagga

paintings art 99.5%

Text analysis

Amazon

SAFETY
KOOAK SAFETY FILM
FILM
KOOAK
SAFETY FILM KODAK
KODAK
E.A.
L.L.

Google

KODAK SAFETY FILM KODAK SAFETY FILM MJIRY33A 2-rAdOX MER K EA LL.
KODAK
SAFETY
FILM
2-rAdOX
K
LL.
MJIRY33A
MER
EA