Human Generated Data

Title

Untitled (passerbys watching street portrait artist at work)

Date

c. 1950

People

Artist: Mary Lowber Tiers, American, 1916–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15854

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 98.6
Human 98.6
Person 97.2
Person 94.5
Person 94.4
Car 90.4
Transportation 90.4
Vehicle 90.4
Automobile 90.4
Person 90.3
Person 88.4
Furniture 75.5
Clothing 72
Apparel 72
Advertisement 67.6
Poster 67.5
Face 63
Female 62.7
Clinic 60.3
Table 56
Indoors 55.8
Text 55.3
Person 51.8

Clarifai
created on 2023-10-29

people 99.9
monochrome 98.7
group 98.4
adult 97.5
canine 97.3
two 96.2
administration 94.4
three 93.9
group together 93.6
leader 93.4
woman 93.1
four 91.4
one 91.1
several 89.9
vehicle 89.6
commerce 89.6
education 89.4
man 88.9
child 87.6
room 87.6

Imagga
created on 2022-02-05

shop 23.4
equipment 23.1
barbershop 21.2
monitor 19.9
newspaper 18.9
technology 17.1
refrigerator 16.7
table 16.7
window 15.6
computer 15.5
room 15.5
home 15.1
working 15
work 14.9
mercantile establishment 14.9
desk 14.3
house 14.2
architecture 14
people 13.9
white goods 13.4
product 13.4
interior 13.3
electronic equipment 13.2
building 13
office 12.7
old 12.5
furniture 12
city 11.6
medical 11.5
man 11.4
indoors 11.4
worker 10.7
home appliance 10.6
creation 10.6
business 10.3
industry 10.2
hospital 10
place of business 9.9
modern 9.8
medicine 9.7
wall 9.4
appliance 9.4
center 9.4
uniform 9.1
device 8.9
metal 8.8
clinic 8.8
glass 8.5
person 8.5
design 8.4
vintage 8.3
machine 8.2
digital 8.1
male 7.8
travel 7.7
men 7.7
professional 7.7
chair 7.7
health 7.6
lamp 7.6
vehicle 7.5
inside 7.4
lifestyle 7.2
screen 7.1
science 7.1

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 99.5
window 99.2
indoor 91.7
person 91.6
clothing 89.7
black and white 73.7

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 60.9%
Calm 96.9%
Sad 2.1%
Disgusted 0.3%
Surprised 0.2%
Angry 0.2%
Confused 0.2%
Fear 0.1%
Happy 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.6%
Person 97.2%
Person 94.5%
Person 94.4%
Person 90.3%
Person 88.4%
Person 51.8%
Car 90.4%

Text analysis

Amazon

Framing
Prints
Fine
Fine Prints s Framing
s

Google

Fine Prints & FrGming
Fine
Prints
&
FrGming