Human Generated Data

Title

Untitled (man and woman packing the car with suitcases)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8805

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.3
Human 99.3
Person 99.3
Clothing 96.6
Apparel 96.6
Shorts 93
Car 90.7
Automobile 90.7
Transportation 90.7
Vehicle 90.7
Female 79.3
Building 72
Outdoors 71.1
Pants 70.8
Plant 67.1
Portrait 66.8
Photo 66.8
Photography 66.8
Face 66.8
Woman 65.1

Clarifai
created on 2023-10-26

people 99.9
vehicle 99
street 98.1
adult 97.7
car 97.5
monochrome 97.4
two 97.3
man 96.4
transportation system 95
one 95
three 93.3
group together 92.3
woman 91.2
group 86.5
police 83.8
actor 83.6
administration 82.3
four 81
several 80.4
driver 78.7

Imagga
created on 2022-01-09

barber chair 55.5
chair 50.8
seat 36.4
sidecar 26.8
conveyance 24.8
furniture 24.7
street 20.2
city 18.3
transportation 16.1
black 15.6
vehicle 15.2
urban 14.8
architecture 14.8
building 14.8
old 14.6
people 14.5
travel 14.1
car 13.7
man 13.4
transport 12.8
road 12.6
shop 12.4
furnishing 11.3
business 10.3
motor scooter 10.2
barbershop 9.5
men 9.4
male 9.2
wheeled vehicle 9.1
scene 8.6
windows 8.6
automobile 8.6
auto 8.6
traffic 8.5
industry 8.5
person 8.4
house 8.4
room 8.3
equipment 8.3
tourism 8.2
style 8.2
cars 7.8
art 7.8
drive 7.6
vintage 7.4
town 7.4
device 7.4
back 7.3
historic 7.3
window 7.3
tourist 7.2
modern 7

Microsoft
created on 2022-01-09

text 98.7
black and white 95
person 86.4
living 82.3
street 82.1
vehicle 72.7
land vehicle 72.4
monochrome 70.9
car 66.3
wheel 64.2
waste container 55.7

Face analysis

AWS Rekognition

Age 29-39
Gender Female, 80%
Calm 99.7%
Happy 0.2%
Sad 0.1%
Disgusted 0%
Surprised 0%
Confused 0%
Angry 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Car 90.7%

Text analysis

Amazon

394
10
394 80-A
80-A
RIDA
STATE
V STATE DI
DI
V

Google

39480-A. M 7--YT33A°2--XAGO
39480-A.
M
7--YT33A°2--XAGO