Human Generated Data

Title

Untitled (man and woman with cars on street)

Date

c. 1950

People

Artist: Bachrach Studios, founded 1868

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18963

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Human 99.7
Person 99.7
Person 99.4
Apparel 98.9
Clothing 98.9
Wheel 98.7
Machine 98.7
Car 97.5
Automobile 97.5
Vehicle 97.5
Transportation 97.5
Town 95.5
Metropolis 95.5
City 95.5
Building 95.5
Urban 95.5
Car 81.3
Countryside 65.6
Nature 65.6
Shelter 65.6
Outdoors 65.6
Rural 65.6
Female 64.8
Person 64.3
Person 63.6
Shorts 63.5
Downtown 63.3
Coat 57.5
Architecture 55.9
Dress 55.9
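Each machine tag above is paired with a confidence score (0–100). A minimal sketch of how such pairs might be filtered by a confidence threshold, using a few values copied from the Amazon list above (the 90.0 cutoff is an arbitrary choice for illustration, not part of the original record):

```python
# Hypothetical sketch: filter (label, confidence) pairs by a threshold.
# The pairs below are a subset of the Amazon tags listed above.
labels = [
    ("Human", 99.7), ("Person", 99.7), ("Apparel", 98.9),
    ("Wheel", 98.7), ("Car", 97.5), ("Town", 95.5),
    ("Car", 81.3), ("Countryside", 65.6), ("Female", 64.8),
]

# Keep only labels the model reported with at least 90% confidence.
high_confidence = [name for name, score in labels if score >= 90.0]
print(high_confidence)
```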

Imagga
created on 2022-03-05

people 27.9
man 24.9
person 20.6
adult 20.1
chair 18.9
barber chair 18.9
male 18.5
men 18
hospital 15.6
home 15.1
city 15
seat 14.9
street 14.7
building 14.6
business 14.6
work 14.1
professional 13.6
nurse 13.5
passenger 13.3
medical 13.2
architecture 12.6
room 12.4
device 12.1
patient 12
travel 12
happy 11.9
electrical device 11.9
worker 11.5
medicine 11.4
smile 11.4
health 11.1
mask 10.5
gate 10.4
doctor 10.3
women 10.3
smiling 10.1
job 9.7
interior 9.7
businessman 9.7
indoors 9.7
urban 9.6
turnstile 9.5
culture 9.4
furniture 9.4
inside 9.2
house 9.2
modern 9.1
portrait 9.1
dress 9
team 9
family 8.9
surgery 8.8
walking 8.5
black 8.4
historic 8.2
staff 8.2
outdoors 8.2
crutch 8.1
lady 8.1
transportation 8.1
religion 8.1
shop 8.1
working 7.9
lifestyle 7.9
clinic 7.9
day 7.8
tourist 7.8
emergency 7.7
old 7.7
profession 7.7
adults 7.6
care 7.4
back 7.3
office 7.2
surgeon 7.2
clothing 7
equipment 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 99.6
street 98
black and white 95.3
vehicle 94.8
land vehicle 92.3
wheel 82.4
monochrome 80.9
car 78.7
waste container 66.7

Face analysis

AWS Rekognition

Age 38-46
Gender Male, 99.9%
Happy 59.1%
Calm 23.8%
Surprised 7.5%
Sad 7.4%
Confused 1.1%
Disgusted 0.8%
Angry 0.3%
Fear 0.1%

AWS Rekognition

Age 35-43
Gender Male, 90.5%
Calm 55.5%
Happy 37.8%
Surprised 3.6%
Sad 1.5%
Fear 0.9%
Disgusted 0.3%
Confused 0.3%
Angry 0.2%
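The per-face emotion percentages above sum to roughly 100, and a "dominant" emotion can be read off as the highest-scoring entry. A minimal sketch, using the scores from the second face above (the dictionary layout is an assumption about how such results might be held in code, not the Rekognition response format):

```python
# Hypothetical sketch: pick the dominant emotion from a set of
# Rekognition-style percentage scores (second face above).
emotions = {
    "Calm": 55.5, "Happy": 37.8, "Surprised": 3.6, "Sad": 1.5,
    "Fear": 0.9, "Disgusted": 0.3, "Confused": 0.3, "Angry": 0.2,
}

# The dominant emotion is simply the key with the largest score.
dominant = max(emotions, key=emotions.get)
print(dominant)
```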

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Wheel 98.7%
Car 97.5%

Captions

Microsoft

a person standing in front of a store window 65%
a person standing in front of a window 61%
a person sitting in front of a store window 46.7%

Text analysis

Amazon

SAFETY
KODAK
3
E
M
M E 9
9
:

Google

SAFETY
KŐĎAK
KŐĎAK SAFETY