Human Generated Data

Title

New York City

Date

1950, printed later

People

Artist: Louis Faurer, American, 1916–2001

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.282

Copyright

© Estate of Louis Faurer

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Wheel 99
Machine 99
Clothing 98.3
Apparel 98.3
Person 98.3
Human 98.3
Car 97.9
Automobile 97.9
Vehicle 97.9
Transportation 97.9
Poster 90.9
Advertisement 90.9
Coat 89.5
Overcoat 89.5
Brochure 87.4
Paper 87.4
Flyer 87.4
Wheel 87.2
Car 86.8
Suit 83.8
Car 81.7
Tire 78
Car Wheel 68.9
Text 63.4
Sleeve 62.3
Spoke 58.2
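
The Amazon tags above are label-detection output, each followed by a confidence score on a 0–100 scale. As an illustration only, the following is a minimal sketch of how comparable labels can be requested from AWS Rekognition with boto3; the local filename, MaxLabels, and MinConfidence values are assumptions, not the settings used to produce this record.

```python
# Minimal sketch (assumed parameters): request image labels similar to the
# tags listed above using AWS Rekognition's DetectLabels API via boto3.
import boto3

client = boto3.client("rekognition")  # assumes AWS credentials are configured

# "new_york_city_faurer.jpg" is a hypothetical local copy of the photograph.
with open("new_york_city_faurer.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,        # assumed limit
    MinConfidence=55.0,  # assumed threshold
)

# Print "Label confidence" pairs in the same style as the record above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```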

Clarifai
created on 2023-10-25

people 99.8
vehicle 98.6
car 98.3
monochrome 98
man 97.8
street 97.5
adult 97.2
portrait 96.6
one 96.2
transportation system 95.1
two 91.5
airport 90.1
outerwear 88.4
retro 88
group 87.7
group together 86.4
police 86.2
driver 85.8
wear 85.2
administration 83.8

Imagga
created on 2022-01-08

screen 37.4
background 34.5
monitor 28
television 27.3
equipment 27.1
display 26.9
electronic equipment 25.1
technology 23.7
computer 21.8
car 19.1
digital 18.6
business 17
man 16.8
people 16.7
device 16.2
broadcasting 15.6
person 15.4
electronic device 14.3
hand 13.7
communication 13.4
transportation 12.5
automobile 12.4
design 12.4
loudspeaker 12.2
transport 11.9
work 11.8
bright 11.4
vibrant 11.4
adult 11
3d 10.8
cheering 10.8
silhouette 10.8
businessman 10.6
patriotic 10.5
auto 10.5
television camera 10.4
effects 10.4
lights 10.2
drive 10.1
flag 10.1
speed 10.1
male 9.9
telecommunication 9.9
sport 9.9
audience 9.7
stadium 9.7
crowd 9.6
nation 9.5
symbol 9.4
three dimensional 9.3
electronic 9.3
vehicle 9.3
professional 9.3
training 9.2
glowing 9.2
graphics 9.1
data 9.1
road 9
billboard 9
color 8.9
information 8.8
interior 8.8
job 8.8
nighttime 8.8
driver 8.7
shiny 8.7
media 8.6
player 8.5
old 8.4
television equipment 8.4
signboard 8.2
working 7.9
web site 7.9
black 7.8
court 7.8
championship 7.8
match 7.7
modern 7.7
skill 7.7
muscular 7.6
system 7.6
athlete 7.6
wheel 7.5
field 7.5
camera 7.4
park 7.4
event 7.4
street 7.4
occupation 7.3
competition 7.3
connection 7.3
office 7.3
film 7.1
icon 7.1
science 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 99.7
land vehicle 98.8
vehicle 98.7
car 98
wheel 94.2
black and white 78.5
person 56.5
man 54.6
tire 52.2

Color Analysis

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 100%
Fear 84.1%
Calm 10.7%
Confused 2.1%
Angry 0.9%
Sad 0.9%
Surprised 0.7%
Happy 0.4%
Disgusted 0.2%

AWS Rekognition

Age 13-21
Gender Male, 51.2%
Happy 59.2%
Calm 29%
Sad 6.6%
Fear 2.1%
Confused 1.2%
Angry 0.6%
Surprised 0.6%
Disgusted 0.5%
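
The AWS Rekognition face entries above report an estimated age range, a gender estimate, and per-emotion confidences. A minimal sketch of how comparable output can be obtained with boto3's DetectFaces call follows; the filename and the choice to request all attributes are assumptions.

```python
# Minimal sketch (assumed parameters): face attributes like those listed above
# via AWS Rekognition's DetectFaces API.
import boto3

client = boto3.client("rekognition")  # assumes AWS credentials are configured

# Hypothetical local copy of the photograph.
with open("new_york_city_faurer.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions are returned with per-emotion confidence scores.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```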

Microsoft Cognitive Services

Age 38
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Wheel 99%
Person 98.3%
Car 97.9%

Categories

Imagga

interior objects 99.3%

Captions

Microsoft
created on 2022-01-08

graphical user interface 50.2%

Text analysis

Amazon

.F
OKY VIEW

Google

STVI
STVI