Human Generated Data

Title

Untitled (military man and civilians in car amidst crowd)

Date

1959, printed later

People

Artist: Lester Cole, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.512

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.2
Human 99.2
Person 98.5
Person 95.8
Person 92.4
Person 92
Sunglasses 91.8
Accessories 91.8
Accessory 91.8
People 81.5
Clothing 80.3
Apparel 80.3
Face 76.6
Suit 62.5
Coat 62.5
Overcoat 62.5
Poster 62
Advertisement 62
Person 59
Crowd 57.9
Urban 56.9
Collage 56.2
Finger 55.7
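The Amazon tags above pair each label with a confidence percentage. As a minimal sketch, the list could be derived from an AWS Rekognition `detect_labels` response like the hypothetical one below; the values shown are illustrative stand-ins with the documented response shape, not re-queried from this image.

```python
# Hypothetical stand-in for an AWS Rekognition detect_labels response;
# the real call is boto3's rekognition client, omitted here.
response = {
    "Labels": [
        {"Name": "Crowd", "Confidence": 57.9},
        {"Name": "Person", "Confidence": 99.2},
        {"Name": "Sunglasses", "Confidence": 91.8},
    ]
}

# Flatten to "Name Confidence" lines, highest confidence first,
# matching the presentation used in this record.
tags = sorted(response["Labels"], key=lambda l: l["Confidence"], reverse=True)
lines = [f'{l["Name"]} {round(l["Confidence"], 1)}' for l in tags]
print("\n".join(lines))
```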

Clarifai
created on 2023-10-15

people 100
group 98.9
adult 98.8
group together 98.4
administration 97.2
leader 97
vehicle 96.7
woman 96.7
many 96.4
monochrome 96.3
street 96.3
man 96.1
transportation system 94.9
police 92
war 89.6
several 88
portrait 85.4
boy 81.9
convertible 81.8
child 81.5

Imagga
created on 2021-12-14

barbershop 32.5
shop 28.2
building 26.6
architecture 24.3
city 23.3
mercantile establishment 20.9
travel 15.5
house 15
man 14.8
business 14
place of business 13.9
street 13.8
people 12.3
town 12.1
old 11.8
window 11.5
astronaut 11.2
history 10.7
male 10.6
adult 10.4
famous 10.2
historic 10.1
sculpture 9.9
statue 9.7
urban 9.6
person 9.3
tourism 9.1
landmark 9
home 8.8
facade 8.7
office 8.6
finance 8.4
black 8.4
art 8.1
structure 8
work 7.8
car 7.8
iron lung 7.8
men 7.7
money 7.7
historical 7.5
device 7.5
destination 7.5
symbol 7.4
bank 7.2
religion 7.2
modern 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 98.6
person 96.6
clothing 95
man 93.7
black and white 77
people 65.6

Face analysis

AWS Rekognition

Age 28-44
Gender Male, 99.1%
Happy 99.1%
Fear 0.2%
Calm 0.2%
Disgusted 0.2%
Angry 0.2%
Surprised 0.1%
Confused 0.1%
Sad 0%

AWS Rekognition

Age 33-49
Gender Male, 95.7%
Happy 97.9%
Calm 0.9%
Fear 0.4%
Angry 0.3%
Surprised 0.2%
Sad 0.1%
Confused 0.1%
Disgusted 0%

AWS Rekognition

Age 30-46
Gender Female, 98.7%
Calm 81.7%
Angry 10.5%
Disgusted 2.5%
Sad 1.5%
Happy 1.3%
Surprised 1.2%
Confused 0.7%
Fear 0.6%

AWS Rekognition

Age 22-34
Gender Female, 57.2%
Happy 99.7%
Surprised 0.1%
Disgusted 0.1%
Calm 0%
Angry 0%
Fear 0%
Confused 0%
Sad 0%

AWS Rekognition

Age 22-34
Gender Male, 87.3%
Fear 46.1%
Angry 17.4%
Calm 16.8%
Surprised 11.4%
Sad 3.8%
Happy 2.9%
Confused 1.1%
Disgusted 0.4%

AWS Rekognition

Age 23-37
Gender Male, 99.5%
Calm 89.6%
Happy 5.5%
Sad 2.1%
Fear 0.8%
Angry 0.8%
Surprised 0.5%
Disgusted 0.3%
Confused 0.3%
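Each AWS Rekognition block above reports an age range, a gender estimate, and emotions sorted by confidence. A minimal sketch of how one block could be assembled from a `detect_faces` (Attributes=['ALL']) face record follows; the face dict is a hypothetical stand-in with the documented shape, using values from the first block.

```python
# Hypothetical stand-in for one face entry from detect_faces;
# the real call is boto3's rekognition client, omitted here.
face = {
    "AgeRange": {"Low": 28, "High": 44},
    "Gender": {"Value": "Male", "Confidence": 99.1},
    "Emotions": [
        {"Type": "CALM", "Confidence": 0.2},
        {"Type": "HAPPY", "Confidence": 99.1},
        {"Type": "SAD", "Confidence": 0.0},
    ],
}

age = f'Age {face["AgeRange"]["Low"]}-{face["AgeRange"]["High"]}'
gender = f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]}%'
# Emotions are listed highest confidence first, as in this record.
emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
report = [age, gender] + [
    f'{e["Type"].capitalize()} {e["Confidence"]}%' for e in emotions
]
print("\n".join(report))
```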

Microsoft Cognitive Services

Age 29
Gender Male

Microsoft Cognitive Services

Age 44
Gender Male

Microsoft Cognitive Services

Age 27
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
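The Google Vision blocks above use likelihood buckets rather than percentages. As a sketch, these map from the Vision API's Likelihood enum (UNKNOWN=0 through VERY_LIKELY=5); the annotation dict below is a hypothetical stand-in mirroring the first block, not re-queried from this image.

```python
# Google Vision Likelihood enum values, in order.
LIKELIHOOD = [
    "Unknown", "Very unlikely", "Unlikely",
    "Possible", "Likely", "Very likely",
]

# Hypothetical face annotation with enum indices as the API returns them.
annotation = {
    "surprise_likelihood": 1,  # VERY_UNLIKELY
    "anger_likelihood": 1,
    "sorrow_likelihood": 1,
    "joy_likelihood": 5,       # VERY_LIKELY
    "headwear_likelihood": 5,
    "blurred_likelihood": 1,
}

# Render each field as "Name Bucket", as in this record.
labels = {k.replace("_likelihood", "").capitalize(): LIKELIHOOD[v]
          for k, v in annotation.items()}
for name, bucket in labels.items():
    print(f"{name} {bucket}")
```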

Feature analysis

Amazon

Person 99.2%
Sunglasses 91.8%

Text analysis

Amazon

PHILIPS
M-
UNI

Google

PHILIPS GO
PHILIPS
GO