Human Generated Data

Title

Untitled (people with suitcases outside of bus depot)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14354

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Human 99.7
Person 99.7
Person 99.6
Person 99.5
Person 99.5
Person 98.8
Person 98.2
Person 97.9
Shorts 97.5
Clothing 97.5
Apparel 97.5
Person 97.4
Person 95.1
Transportation 93.7
Automobile 93.7
Car 93.7
Vehicle 93.7
Fitness 85.2
Sport 85.2
Sports 85.2
Exercise 85.2
Working Out 85.2
Female 81.5
Pedestrian 80.6
Suit 70.1
Coat 70.1
Overcoat 70.1
Woman 68.8
People 66.6
Path 64
Flooring 60.6
Kid 59.2
Child 59.2
Teen 59.2
Girl 59.2
Blonde 59.2
Face 58.1
Gym 57.3
Sleeve 55.5
Drawing 55
Art 55
Person 54.5
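
The Amazon tags above are confidence-scored labels of the kind returned by AWS Rekognition's DetectLabels operation; the repeated "Person" rows most likely correspond to individual detected instances of that label. A minimal sketch of the call, assuming boto3 with configured AWS credentials and a placeholder file name:

    import boto3

    rekognition = boto3.client("rekognition")

    # Read the photograph and ask Rekognition for labels; the 50% floor is an
    # assumption chosen because the record's lowest scores sit in the mid-50s.
    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=50,
        )

    # Print "Label Confidence" pairs in the same shape as the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")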

Imagga
created on 2022-01-29

people 19.5
person 16.3
man 14.1
sport 13.6
transportation 13.4
transport 11.9
urban 11.4
male 11.3
day 11
field 10.9
chair 10.8
adult 10.4
athlete 10.4
vehicle 10.3
device 9.9
newspaper 9.8
black 9.6
gymnasium 9.6
men 9.4
motion 9.4
travel 9.1
business 9.1
equipment 9
road 9
sky 8.9
working 8.8
structure 8.8
symbol 8.7
water 8.7
product 8.4
portrait 8.4
seat 8.4
city 8.3
room 8.3
speed 8.2
building 8.2
active 8.1
crutch 7.7
summer 7.7
train 7.7
old 7.7
automobile 7.7
health 7.6
walking 7.6
fun 7.5
street 7.4
exercise 7.3
lifestyle 7.2
car 7.2
activity 7.2
staff 7.1
interior 7.1
modern 7
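
The Imagga tags above follow the shape of Imagga's v2 tagging endpoint, which returns English tag names with confidence scores. A minimal sketch using plain HTTP with Basic auth; the key, secret, and image URL are placeholders:

    import requests

    # Imagga's tagging endpoint takes an image URL and HTTP Basic credentials.
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/photo.jpg"},
        auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),
    )
    response.raise_for_status()

    # Each entry pairs an English tag with a confidence score, as listed above.
    for tag in response.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")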

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

street 92.6
text 92.4
black and white 92
person 81.1
people 60.2

Face analysis

AWS Rekognition

Age 24-34
Gender Male, 99.9%
Calm 98.7%
Surprised 0.4%
Sad 0.4%
Confused 0.2%
Disgusted 0.1%
Angry 0.1%
Happy 0%
Fear 0%

AWS Rekognition

Age 21-29
Gender Female, 99.9%
Calm 77%
Fear 6.3%
Surprised 5.1%
Confused 2.8%
Sad 2.8%
Happy 2.6%
Angry 2.2%
Disgusted 1.2%

AWS Rekognition

Age 27-37
Gender Male, 96.6%
Calm 72.3%
Happy 14.7%
Fear 6.1%
Sad 3.2%
Angry 1.9%
Surprised 1%
Disgusted 0.6%
Confused 0.3%

AWS Rekognition

Age 23-31
Gender Male, 85.3%
Sad 25.1%
Surprised 24.3%
Calm 21.9%
Fear 9.6%
Angry 6.7%
Disgusted 5.4%
Confused 4.4%
Happy 2.5%

AWS Rekognition

Age 22-30
Gender Male, 95.7%
Happy 39.7%
Fear 28.9%
Surprised 10.1%
Calm 6.3%
Sad 4.5%
Confused 4.1%
Disgusted 3.8%
Angry 2.7%
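
The per-face blocks above (age range, gender, and a ranked set of emotion scores) match the output of Rekognition's DetectFaces operation when all facial attributes are requested. A minimal sketch, again assuming boto3 credentials and a placeholder file name:

    import boto3

    rekognition = boto3.client("rekognition")

    # Attributes=["ALL"] is required; the default response omits age,
    # gender, and emotion estimates.
    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],
        )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions arrive unsorted; order them to mirror the listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")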

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely
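
The Google Vision blocks above are face-annotation likelihoods (surprise, anger, sorrow, joy, headwear, blur) rather than numeric scores. A minimal sketch using the Cloud Vision Python client, with a placeholder file name:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Each face carries bucketed likelihoods such as VERY_UNLIKELY or LIKELY,
    # which correspond to the "Very unlikely" / "Likely" rows above.
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)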

Feature analysis

Amazon

Person 99.7%
Car 93.7%

Captions

Microsoft

a group of people standing in front of a building 71.4%
a group of people in front of a building 71.3%
a group of people that are standing in front of a building 62.3%
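
The three scored captions above are the kind of ranked candidates Azure's Computer Vision "describe image" call returns (the same service's tagging call is the likely source of the Microsoft tag scores earlier in this record). A minimal sketch with a placeholder endpoint, key, and file name:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("YOUR_KEY"),
    )

    # Ask for up to three candidate captions, matching the count shown above.
    with open("photo.jpg", "rb") as f:
        description = client.describe_image_in_stream(f, max_candidates=3)

    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")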

Text analysis

Amazon

DEPOT
U
OPEN
AGE
UNION
Bank
S
U S B
B
a
A70A
MARAT
le
MJI7 YE37AS A70A
MJI7
PL
le 7
YE37AS
ALMAR
7
e
an
I
HPAPT
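
The strings above (legible signage such as DEPOT, UNION, and OPEN, along with mirror-reversed fragments) are typical word-level output from Rekognition's DetectText operation. A minimal sketch, with a placeholder file name:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # DetectText returns both LINE and WORD entries; keep the words to
    # match the flat list above.
    for detection in response["TextDetections"]:
        if detection["Type"] == "WORD":
            print(detection["DetectedText"])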

Google

DEPOT MJ17 YT37A2 AA US ACE
AA
US
YT37A2
DEPOT
MJ17
ACE