Human Generated Data

Title

Untitled (wedding guests seated on the floor with flowers)

Date

1941

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8593

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.3
Human 99.3
Person 98.9
Clothing 97.9
Apparel 97.9
Face 96.9
Person 96.5
Dress 92.1
Person 91
Poster 88.9
Advertisement 88.9
Female 85.8
Collage 83.6
Tie 82.2
Accessories 82.2
Accessory 82.2
People 80.7
Text 69.6
Costume 69.4
Portrait 69.2
Photography 69.2
Photo 69.2
Suit 68.3
Coat 68.3
Overcoat 68.3
Head 68
Woman 66.1
Paper 65.2
Girl 64.4
Outdoors 64.3
Performer 64.2
Glasses 63.8
Shorts 63
Leisure Activities 59.1
Kid 57.6
Child 57.6
Flyer 56.2
Brochure 56.2
Sunglasses 55.7
Person 48
Person 42.5
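
The Amazon tags above have the shape of AWS Rekognition DetectLabels output (a label name plus a confidence score). A minimal sketch of such a request is shown below; the file name "photo.jpg" and the MaxLabels/MinConfidence values are illustrative placeholders, not details of the museum's actual tagging pipeline.

```python
# Minimal sketch of an AWS Rekognition label request via boto3.
# "photo.jpg", MaxLabels and MinConfidence are placeholder values.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=40,
    )

# Each label carries a name and a confidence score, e.g. "Person 99.3".
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```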

Clarifai
created on 2023-10-25

people 99.9
group 98.3
many 97.8
group together 97.6
adult 97
man 96.4
woman 96.3
child 92.2
administration 91.8
monochrome 91.2
wear 89.3
street 89
several 88.7
recreation 87.8
leader 87.6
music 87.2
dancing 84.7
facial expression 83.1
musician 79.7
crowd 76.9

Imagga
created on 2022-01-09

man 39.6
male 36.2
businessman 31.8
business 30.4
people 27.9
person 24.6
world 22.2
office 21
adult 19.9
work 18.8
group 18.5
corporate 17.2
newspaper 17.2
professional 17.1
job 16.8
executive 15.3
businesspeople 15.2
meeting 15.1
room 14.8
men 14.6
computer 14.4
working 14.1
manager 14
happy 13.8
businesswoman 13.6
looking 13.6
portrait 13.6
team 13.4
casual 12.7
building 12.3
together 12.3
couple 12.2
education 12.1
teamwork 12
worker 11.7
product 11.7
indoors 11.4
engineer 11.3
senior 11.2
sitting 11.2
technology 11.1
paper 11.1
suit 10.9
conference 10.7
table 10.4
successful 10.1
hand 9.9
human 9.7
discussion 9.7
colleagues 9.7
success 9.7
home 9.6
serious 9.5
plan 9.4
teacher 9.2
communication 9.2
face 9.2
creation 9.2
indoor 9.1
laptop 9.1
patient 8.9
discussing 8.8
employee 8.8
client 8.8
designer 8.7
cooperation 8.7
architect 8.7
smiling 8.7
project 8.7
engineering 8.6
desk 8.5
two 8.5
finance 8.4
horizontal 8.4
occupation 8.2
20s 8.2
student 8
women 7.9
brainstorming 7.9
necktie 7.8
60s 7.8
black 7.8
partner 7.7
class 7.7
construction 7.7
money 7.7
talking 7.6
career 7.6
company 7.4
document 7.4
camera 7.4
design 7.3
new 7.3
color 7.2

Google
created on 2022-01-09

Black 89.5
Black-and-white 86.3
Style 84.1
Font 80.5
Adaptation 79.4
Monochrome photography 79.2
Monochrome 77.4
T-shirt 76.8
Art 76.4
Shorts 71.8
Event 70.6
Hat 70.4
Room 65.6
Photo caption 64.5
Visual arts 63.9
Stock photography 63.4
Crew 62.6
Fun 60.5
Sitting 60.1
Happy 59.2
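
The Google tags above resemble label annotations from the Cloud Vision API. A hedged sketch using the google-cloud-vision client follows; the local file name is a placeholder.

```python
# Sketch of a Google Cloud Vision label request; "photo.jpg" is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scores are reported on a 0-1 scale; multiply by 100 to match the list above.
for annotation in response.label_annotations:
    print(f"{annotation.description} {annotation.score * 100:.1f}")
```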

Microsoft
created on 2022-01-09

person 99.9
text 98.9
clothing 92.3
black and white 85
man 79.3
drawing 64.4
cartoon 60.1
poster 51.8

Face analysis

AWS Rekognition

Age 40-48
Gender Female, 68.5%
Happy 93.9%
Fear 2%
Surprised 1.5%
Angry 0.9%
Calm 0.8%
Disgusted 0.4%
Confused 0.2%
Sad 0.2%

AWS Rekognition

Age 34-42
Gender Female, 94%
Calm 66.4%
Sad 21.2%
Happy 5.3%
Surprised 2.1%
Confused 1.9%
Fear 1.6%
Disgusted 1.2%
Angry 0.4%

AWS Rekognition

Age 47-53
Gender Male, 96.8%
Surprised 45.6%
Happy 21.5%
Calm 19.6%
Sad 7.1%
Confused 3.7%
Fear 0.9%
Angry 0.9%
Disgusted 0.8%

AWS Rekognition

Age 50-58
Gender Male, 98.4%
Calm 81.2%
Happy 14.3%
Sad 2.4%
Surprised 1.5%
Confused 0.3%
Disgusted 0.2%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 34-42
Gender Male, 80.1%
Sad 35.5%
Surprised 33.6%
Fear 8.8%
Angry 6.1%
Happy 5%
Disgusted 4.1%
Calm 3.9%
Confused 3%

AWS Rekognition

Age 28-38
Gender Female, 99.8%
Happy 56.3%
Calm 26.4%
Surprised 9.9%
Sad 2.5%
Confused 1.9%
Disgusted 1.4%
Fear 1%
Angry 0.6%
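
The age ranges, gender guesses, and emotion scores in the AWS Rekognition blocks above correspond to the FaceDetails returned by DetectFaces when all attributes are requested. A minimal sketch, again with a placeholder file name, is:

```python
# Sketch of an AWS Rekognition face-attribute request via boto3.
# "photo.jpg" is a placeholder file name.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

# Each detected face includes an estimated age range, a gender guess with
# confidence, and a ranked list of emotion scores, as listed above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```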

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
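
The Google Vision rows above report likelihood buckets rather than percentages. A sketch of the corresponding face_detection call (placeholder file name) is:

```python
# Sketch of a Google Cloud Vision face request; "photo.jpg" is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face reports likelihood buckets (VERY_UNLIKELY ... VERY_LIKELY)
# for the attributes listed above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```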

Feature analysis

Amazon

Person 99.3%
Tie 82.2%

Text analysis

Amazon

17741.
BANK
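
The detected strings above ("17741." and "BANK") match the shape of AWS Rekognition DetectText output. A minimal sketch with a placeholder file name:

```python
# Sketch of an AWS Rekognition text-detection request via boto3.
# "photo.jpg" is a placeholder file name.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# LINE detections return whole strings; WORD detections break them into tokens.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')
```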

Google

ררה T3EA8-MAMT8AI
ררה
T3EA8-MAMT8AI