Human Generated Data

Title

Untitled (man and woman standing outside car with luggage)

Date

1953

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8166

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 98.8
Human 98.8
Person 95.3
Airplane 77.3
Transportation 77.3
Vehicle 77.3
Aircraft 77.3
Clothing 76.1
Apparel 76.1
Text 69.4
Furniture 67.1
Chair 67.1
Female 60.2
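
The label list above is standard Amazon Rekognition output. For reference, here is a minimal boto3 sketch of the DetectLabels call that yields name/confidence pairs like these; the bucket and object names are placeholders, not the museum's actual storage.

import boto3

# Placeholder S3 reference; any Rekognition-readable image source works.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "my-bucket", "Name": "steinmetz-8166.jpg"}},
    MaxLabels=20,
    MinConfidence=60.0,  # drop low-confidence labels, as in the list above
)

# Each label carries a name and a confidence percentage.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')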

Clarifai
created on 2023-10-25

people 99.9
vehicle 99.9
transportation system 99.6
adult 99.2
group together 99.1
two 98.6
watercraft 98.5
group 98.2
woman 97.9
three 96.9
man 96.3
child 96.2
car 95.4
sitting 95.2
four 95
monochrome 94.9
several 92.6
one 92.2
aircraft 91.8
recreation 90.9
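
Concept lists like the one above come from Clarifai's v2 predict API. A hedged sketch over plain HTTP follows; the model id ("general-image-recognition"), the personal access token, and the image URL are all assumptions based on Clarifai's public documentation, not details taken from this record.

import requests

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_PAT"},  # placeholder personal access token
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
    timeout=30,
)
resp.raise_for_status()

# Concepts come back with a name and a 0-1 confidence value.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')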

Imagga
created on 2022-01-08

laptop 46.6
computer 42.1
technology 28.2
work 26.7
office 26.2
notebook 26
business 24.3
working 23.9
equipment 19.7
communication 18.5
people 18.4
worker 17.8
person 17.7
monitor 17.6
keyboard 16.9
adult 16.8
wireless 16.2
man 16.1
corporate 15.5
desk 15.2
professional 15.2
device 15.2
businesswoman 14.5
screen 14.2
car 14.1
modern 14
executive 13.8
sitting 13.7
male 13.5
attractive 12.6
job 12.4
smiling 12.3
happy 11.9
indoors 11.4
vehicle 11.2
typing 10.7
interior 10.6
businessman 10.6
automobile 10.5
home 10.4
table 10.3
phone 10.1
smile 10
hand 9.9
transportation 9.9
motor vehicle 9.9
portrait 9.7
auto 9.6
women 9.5
men 9.4
lifestyle 9.4
network 8.9
information 8.9
success 8.8
portable computer 8.7
room 8.5
mobile 8.5
web 8.5
student 8.1
hair 7.9
black 7.8
sofa 7.7
pretty 7.7
golf equipment 7.7
workplace 7.6
electronic 7.5
back 7.3
road 7.2
personal computer 7.2
display 7.1
wheeled vehicle 7.1
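
Imagga tags of this shape come from its v2 tagging endpoint; the mostly office-themed tags here look like a misclassification of the scene, which the raw scores at least make visible. A hedged sketch follows; the key/secret pair and image URL are placeholders, and the response shape follows Imagga's published v2 docs.

import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("api_key", "api_secret"),  # HTTP Basic auth per Imagga's docs
    timeout=30,
)
resp.raise_for_status()

# Tags are nested as {"tag": {"en": ...}, "confidence": ...}.
for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')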

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 98.9
black and white 89.7
computer 56.5
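
Tags like these come from Azure Computer Vision. A hedged sketch against the v3.2 Tag REST endpoint; the resource endpoint and subscription key are placeholders.

import requests

endpoint = "https://my-resource.cognitiveservices.azure.com"  # placeholder resource
resp = requests.post(
    f"{endpoint}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.com/photo.jpg"},
    timeout=30,
)
resp.raise_for_status()

# Confidence is reported on a 0-1 scale.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')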

Face analysis

AWS Rekognition

Age 36-44
Gender Female, 96.7%
Calm 93.6%
Happy 5.7%
Confused 0.3%
Sad 0.2%
Disgusted 0.1%
Surprised 0.1%
Angry 0.1%
Fear 0%
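
The age range, gender, and emotion percentages above correspond to one entry in the FaceDetails array of Amazon Rekognition's DetectFaces response. A minimal boto3 sketch; the S3 reference is a placeholder.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "my-bucket", "Name": "steinmetz-8166.jpg"}},
    Attributes=["ALL"],  # request age, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions arrive unsorted; sort highest-first to match the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')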

Google Vision (face 1)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision (face 2)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision (face 3)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
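
Google Vision reports per-face likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, one annotation per detected face, which is why three blocks appear above. A hedged sketch using the google-cloud-vision client library; the local file path is a placeholder and credentials are assumed to come from the environment.

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("steinmetz-8166.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each detected face reports likelihood enums rather than scores.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)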

Feature analysis

Amazon

Person 98.8%
Airplane 77.3%

Text analysis

Amazon

390
390 82
82
PERSO
to
YТ37A°-AX
V.S
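
The strings above, including the partial and garbled ones, are raw OCR detections. A minimal boto3 sketch of Amazon Rekognition's DetectText call that prints LINE-level results; the S3 reference is again a placeholder.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "my-bucket", "Name": "steinmetz-8166.jpg"}}
)

# LINE detections group words; WORD detections are the individual tokens.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])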

Google

390 82
390
82