Human Generated Data

Title

Untitled (man and woman standing outside car with luggage)

Date

1953

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8167

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 98.8
Human 98.8
Person 98.7
Clothing 79.9
Apparel 79.9
Shorts 66.6
Vehicle 65.5
Transportation 65.5
Car 63.9
Automobile 63.9
Text 61.5
Housing 56.9
Building 56.9
Car 55.7
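
The Amazon tags above follow the shape of AWS Rekognition label detections: a label name paired with a confidence score (percent). A minimal sketch of how such tags can be generated with the boto3 SDK, assuming a local copy of the image; the file name and thresholds are illustrative, not part of this record:

```python
# Minimal sketch: generating Rekognition-style labels for a photograph.
# The file name and thresholds are illustrative assumptions.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_4.2002.8167.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # cap the number of returned labels
    MinConfidence=55.0,  # lowest confidence listed above is ~55.7
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```

Rekognition also returns parent labels alongside each detection, which would explain why related terms such as Car, Automobile, Vehicle, and Transportation appear together with near-identical scores.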

Clarifai
created on 2023-10-25

people 99.9
vehicle 99.8
transportation system 99
adult 98.9
group together 98.8
woman 98
group 97.9
two 97.2
car 95.4
three 95.2
wear 95.1
child 95
man 94.9
monochrome 94.2
four 93.6
watercraft 93.6
sitting 93.4
furniture 92.9
several 92.3
seat 91.6
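
The Clarifai concepts likewise pair a concept name with a confidence score. A minimal sketch against Clarifai's v2 REST API, assuming a hosted copy of the image and the public general-recognition model; the token, URL, and model ID are illustrative assumptions:

```python
# Minimal sketch: Clarifai v2 concept prediction. All identifiers below
# are illustrative assumptions, not part of this record.
import requests

PAT = "YOUR_CLARIFAI_PAT"
IMAGE_URL = "https://example.org/steinmetz_4.2002.8167.jpg"

response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {PAT}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)

# Concept values are 0-1 floats; scale to the percentages shown above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))
```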

Imagga
created on 2022-01-08

laptop 29.8
computer 28.6
work 26.7
working 26.5
person 25.2
people 23.4
adult 22.7
man 22.2
business 21.2
worker 20.5
technology 20
vehicle 19.2
car 18.1
job 16.8
office 16.7
professional 15.3
male 14.9
attractive 14.7
automobile 14.3
transportation 14.3
notebook 13.9
engine 13.5
equipment 13.4
sitting 12.9
businesswoman 12.7
one 12.7
smiling 12.3
lifestyle 12.3
men 12
portrait 11.6
device 11.6
repair 11.5
auto 11.5
hand 11.4
happy 11.3
occupation 11
communication 10.9
tool 10.7
smile 10.7
driver 10.7
travel 10.6
pretty 10.5
hair 10.3
phone 10.1
drive 9.6
home 9.6
wireless 9.5
corporate 9.4
engineer 9.3
transport 9.1
modern 9.1
holding 9.1
clothing 9
cheerful 8.9
lady 8.9
businessman 8.8
women 8.7
driving 8.7
model 8.5
musical instrument 8.5
casual 8.5
executive 8.3
uniform 8.3
handsome 8
desk 8
interior 8
employee 7.8
typing 7.8
motor 7.7
outdoor 7.6
wheel 7.5
mobile 7.5
keyboard 7.5
help 7.4
monitor 7.4
safety 7.4
protection 7.3
looking 7.2
body 7.2
face 7.1
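
The Imagga tags have the output shape of Imagga's v2 tagging endpoint, which returns a confidence score with a localized tag name. A minimal sketch, assuming a hosted image URL and API credentials, all illustrative:

```python
# Minimal sketch: Imagga v2 tagging. Credentials and image URL are
# illustrative assumptions, not part of this record.
import requests

API_KEY = "YOUR_IMAGGA_KEY"
API_SECRET = "YOUR_IMAGGA_SECRET"
IMAGE_URL = "https://example.org/steinmetz_4.2002.8167.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # HTTP basic auth with key/secret
)

for tag in response.json()["result"]["tags"]:
    print(tag["tag"]["en"], round(tag["confidence"], 1))
```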

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 97.8
black and white 79.6
clothing 57.7
person 53.6
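
The Microsoft tags (and the caption listed under Captions further below) match the output shape of Azure Computer Vision's analyze_image call with the tags and description features requested. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK; endpoint, key, and image URL are illustrative assumptions:

```python
# Minimal sketch: Azure Computer Vision tags + caption in one call.
# Endpoint, key, and image URL are illustrative assumptions.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR-RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

analysis = client.analyze_image(
    "https://example.org/steinmetz_4.2002.8167.jpg",
    visual_features=[VisualFeatureTypes.tags, VisualFeatureTypes.description],
)

for tag in analysis.tags:                      # e.g. "text 97.8"
    print(tag.name, round(tag.confidence * 100, 1))
for caption in analysis.description.captions:  # e.g. "text 59.2%"
    print(caption.text, round(caption.confidence * 100, 1))
```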

Face analysis

AWS Rekognition

Age 28-38
Gender Male, 57.4%
Happy 91.8%
Calm 4.8%
Sad 1.6%
Fear 0.5%
Confused 0.4%
Angry 0.4%
Disgusted 0.3%
Surprised 0.2%

AWS Rekognition

Age 30-40
Gender Female, 97%
Calm 71%
Happy 18.8%
Surprised 6%
Angry 1.8%
Confused 0.8%
Sad 0.7%
Disgusted 0.7%
Fear 0.2%
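
Both face records above have the shape of Rekognition DetectFaces output with full attributes requested: an estimated age range, a gender guess with its confidence, and a probability distribution over eight emotions. A minimal sketch, reusing the same hypothetical local file as above:

```python
# Minimal sketch: Rekognition face analysis with full attributes.
# The input file is a hypothetical local copy of the photograph.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_4.2002.8167.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```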

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

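The Google Vision rows report face-annotation likelihoods on a five-step scale (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A minimal sketch with the google-cloud-vision client library, using the same illustrative file path:

```python
# Minimal sketch: Google Cloud Vision face detection, reporting the
# likelihood scale shown above. The image path is an illustrative assumption.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_4.2002.8167.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```
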
Feature analysis

Amazon

Person 98.8%
Car 63.9%

Categories

Imagga

paintings art 99.8%

Captions

Microsoft
created on 2022-01-08

text 59.2%

Text analysis

Amazon

390
390 82-A.
82-A.
-
DVD
asics
FBC

Google

390
82-A.
390 82-A.
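
The detected strings above ("390 82-A." reads like a negative or sleeve number) are typical short OCR line and word detections. For the Amazon rows, a minimal sketch with Rekognition's DetectText, again using the same illustrative file:

```python
# Minimal sketch: Rekognition text detection over the same hypothetical
# image file; returns short LINE and WORD detections like those above.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_4.2002.8167.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip per-word duplicates
        print(detection["DetectedText"])
```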